Search Results

Documents authored by Marx, Dániel


Found 2 Possible Name Variants:

Marx, Dániel

Document
Approximate Monotone Local Search for Weighted Problems

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
In a recent work, Esmer et al. describe a simple method - Approximate Monotone Local Search - to obtain exponential approximation algorithms from existing parameterized exact algorithms, polynomial-time approximation algorithms and, more generally, parameterized approximation algorithms. In this work, we generalize those results to the weighted setting. More formally, we consider monotone subset minimization problems over a weighted universe of size n (e.g., Vertex Cover, d-Hitting Set and Feedback Vertex Set). We consider a model where the algorithm is only given access to a subroutine that finds a solution of weight at most α ⋅ W (and of arbitrary cardinality) in time c^k ⋅ n^{𝒪(1)} where W is the minimum weight of a solution of cardinality at most k. In the unweighted setting, Esmer et al. determine the smallest value d for which a β-approximation algorithm running in time dⁿ ⋅ n^{𝒪(1)} can be obtained in this model. We show that the same dependencies also hold in a weighted setting in this model: for every fixed ε > 0 we obtain a β-approximation algorithm running in time 𝒪((d+ε)ⁿ), for the same d as in the unweighted setting. Similarly, we also extend a β-approximate brute-force search (in a model which only provides access to a membership oracle) to the weighted setting. Using existing approximation algorithms and exact parameterized algorithms for weighted problems, we obtain the first exponential-time β-approximation algorithms that are better than brute force for a variety of problems including Weighted Vertex Cover, Weighted d-Hitting Set, Weighted Feedback Vertex Set and Weighted Multicut.

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Approximate Monotone Local Search for Weighted Problems. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 17:1-17:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{esmer_et_al:LIPIcs.IPEC.2023.17,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Approximate Monotone Local Search for Weighted Problems}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{17:1--17:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.17},
  URN =		{urn:nbn:de:0030-drops-194360},
  doi =		{10.4230/LIPIcs.IPEC.2023.17},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
Computing Generalized Convolutions Faster Than Brute Force

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In this paper, we consider a general notion of convolution. Let D be a finite domain and let Dⁿ be the set of n-length vectors (tuples) of D. Let f : D × D → D be a function and let ⊕_f be a coordinate-wise application of f. The f-Convolution of two functions g,h : Dⁿ → {-M,…,M} is (g ⊛_f h)(𝐯) := ∑_{v_g,v_h ∈ D^n s.t. 𝐯 = v_g ⊕_f v_h} g(v_g) ⋅ h(v_h) for every 𝐯 ∈ Dⁿ. This problem generalizes many fundamental convolutions such as Subset Convolution, XOR Product, Covering Product, and Packing Product. For an arbitrary function f and domain D we can compute f-Convolution via brute-force enumeration in 𝒪̃(|D|^{2n} ⋅ polylog(M)) time. Our main result is an improvement over this naive algorithm. We show that f-Convolution can be computed exactly in 𝒪̃((c ⋅ |D|²)ⁿ ⋅ polylog(M)) time for the constant c := 5/6 when D has even cardinality. Our main observation is that a cyclic partition of a function f : D × D → D can be used to speed up the computation of f-Convolution, and we show that an appropriate cyclic partition exists for every f. Furthermore, we demonstrate that a single entry of the f-Convolution can be computed more efficiently. In this variant, we are given two functions g,h : Dⁿ → {-M,…,M} together with a vector 𝐯 ∈ Dⁿ, and the task of the f-Query problem is to compute the integer (g ⊛_f h)(𝐯). This is a generalization of the well-known Orthogonal Vectors problem. We show that f-Query can be computed in 𝒪̃(|D|^{(ω/2)n} ⋅ polylog(M)) time, where ω ∈ [2,2.373) is the exponent of the currently fastest matrix multiplication algorithm.
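
To make the definition concrete, the following brute-force sketch (in Python; the function name f_convolution and the toy example are ours, not from the paper) evaluates the f-Convolution by enumerating all |D|^{2n} pairs of vectors, i.e., exactly the naive baseline that the paper improves upon.

from itertools import product

def f_convolution(f, D, g, h, n):
    # (g (*)_f h)(v) = sum of g(v_g) * h(v_h) over all pairs with v = v_g (+)_f v_h,
    # computed by brute force with |D|^{2n} pairs of vectors.
    result = {}
    for v_g in product(D, repeat=n):
        for v_h in product(D, repeat=n):
            v = tuple(f(a, b) for a, b in zip(v_g, v_h))  # coordinate-wise f
            result[v] = result.get(v, 0) + g(v_g) * h(v_h)
    return result

# Toy instance: D = {0,1} with f = OR corresponds to the Covering Product.
conv = f_convolution(lambda a, b: a | b, (0, 1), lambda v: sum(v), lambda v: 1, 3)
print(conv[(1, 1, 1)])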

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki. Computing Generalized Convolutions Faster Than Brute Force. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 12:1-12:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{esmer_et_al:LIPIcs.IPEC.2022.12,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Schepper, Philipp and W\k{e}grzycki, Karol},
  title =	{{Computing Generalized Convolutions Faster Than Brute Force}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{12:1--12:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.12},
  URN =		{urn:nbn:de:0030-drops-173685},
  doi =		{10.4230/LIPIcs.IPEC.2022.12},
  annote =	{Keywords: Generalized Convolution, Fast Fourier Transform, Fast Subset Convolution}
}
Document
Domination and Cut Problems on Chordal Graphs with Bounded Leafage

Authors: Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
The leafage of a chordal graph G is the minimum integer 𝓁 such that G can be realized as an intersection graph of subtrees of a tree with 𝓁 leaves. We consider structural parameterization by the leafage of classical domination and cut problems on chordal graphs. Fomin, Golovach, and Raymond [ESA 2018, Algorithmica 2020] proved, among other things, that Dominating Set on chordal graphs admits an algorithm running in time 2^𝒪(𝓁²) ⋅ n^𝒪(1). We present a conceptually much simpler algorithm that runs in time 2^𝒪(𝓁) ⋅ n^𝒪(1). We extend our approach to obtain similar results for Connected Dominating Set and Steiner Tree. We then consider the two classical cut problems MultiCut with Undeletable Terminals and Multiway Cut with Undeletable Terminals. We prove that the former is W[1]-hard when parameterized by the leafage and complement this result by presenting a simple n^𝒪(𝓁)-time algorithm. In contrast, and to our surprise, we find that Multiway Cut with Undeletable Terminals on chordal graphs can be solved in n^𝒪(1) time.

Cite as

Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale. Domination and Cut Problems on Chordal Graphs with Bounded Leafage. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 14:1-14:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{galby_et_al:LIPIcs.IPEC.2022.14,
  author =	{Galby, Esther and Marx, D\'{a}niel and Schepper, Philipp and Sharma, Roohani and Tale, Prafullkumar},
  title =	{{Domination and Cut Problems on Chordal Graphs with Bounded Leafage}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{14:1--14:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.14},
  URN =		{urn:nbn:de:0030-drops-173704},
  doi =		{10.4230/LIPIcs.IPEC.2022.14},
  annote =	{Keywords: Chordal Graphs, Leafage, FPT Algorithms, Dominating Set, MultiCut with Undeletable Terminals, Multiway Cut with Undeletable Terminals}
}
Document
Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In the general AntiFactor problem, a graph G and, for every vertex v of G, a set X_v ⊆ ℕ of forbidden degrees are given. The task is to find a set S of edges such that, for every vertex v, the degree of v in S is not in the set X_v. Standard techniques (dynamic programming plus fast convolution) can be used to show that if M is the largest forbidden degree, then the problem can be solved in time (M+2)^{tw}⋅n^{O(1)} if a tree decomposition of width tw is given. However, significantly faster algorithms are possible if the sets X_v are sparse: our main algorithmic result shows that if every vertex has at most x forbidden degrees (we call this special case AntiFactor_x), then the problem can be solved in time (x+1)^{O(tw)}⋅n^{O(1)}. That is, AntiFactor_x is fixed-parameter tractable parameterized by treewidth tw and the maximum number x of excluded degrees. Our algorithm uses the technique of representative sets, which can be generalized to the optimization version, but (as expected) not to the counting version of the problem. In fact, we show that #AntiFactor₁ is already #W[1]-hard parameterized by the width of the given decomposition. Moreover, we show that, unlike for the decision version, the standard dynamic programming algorithm is essentially optimal for the counting version. Formally, for a fixed nonempty set X, we denote by X-AntiFactor the special case where every vertex v has the same set X_v = X of forbidden degrees. We show the following lower bound for every fixed set X: if there is an ε > 0 such that #X-AntiFactor can be solved in time (max X+2-ε)^{tw}⋅n^{O(1)} given a tree decomposition of width tw, then the Counting Strong Exponential-Time Hypothesis (#SETH) fails.
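
As a concrete illustration of the problem statement only (not of the representative-sets algorithm from the paper), here is a brute-force Python sketch; the names and the toy triangle instance are ours.

from itertools import combinations

def antifactor_brute_force(vertices, edges, forbidden):
    # Search all edge subsets S (smallest first) for one where no vertex v
    # has deg_S(v) in its forbidden set X_v; exponential in |E|.
    for r in range(len(edges) + 1):
        for S in combinations(edges, r):
            deg = {v: 0 for v in vertices}
            for u, w in S:
                deg[u] += 1
                deg[w] += 1
            if all(deg[v] not in forbidden[v] for v in vertices):
                return set(S)
    return None

# Triangle where degree 0 is forbidden at every vertex: two edges suffice.
print(antifactor_brute_force(["a", "b", "c"],
                             [("a", "b"), ("b", "c"), ("a", "c")],
                             {"a": {0}, "b": {0}, "c": {0}}))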

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard). In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 22:1-22:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{marx_et_al:LIPIcs.IPEC.2022.22,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{22:1--22:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.22},
  URN =		{urn:nbn:de:0030-drops-173780},
  doi =		{10.4230/LIPIcs.IPEC.2022.22},
  annote =	{Keywords: Anti-Factor, General Factor, Treewidth, Representative Sets, SETH}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)

Authors: Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný

Published in: Dagstuhl Reports, Volume 12, Issue 5 (2022)


Abstract
Constraint satisfaction has always played a central role in computational complexity theory; appropriate versions of CSPs are classical complete problems for most standard complexity classes. CSPs constitute a very rich and yet sufficiently manageable class of problems to give a good perspective on general computational phenomena. For instance, they help to understand which mathematical properties make a computational problem tractable (in a wide sense, e.g., polynomial-time solvable, non-trivially approximable, fixed-parameter tractable, or definable in a weak logic). In the last 15 years, research activity in this area has intensified significantly, and hugely impressive progress has been made. The Dagstuhl Seminar 22201 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP so that they can share their insights obtained during the past four years. This report documents the material presented during the course of the seminar.

Cite as

Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201). In Dagstuhl Reports, Volume 12, Issue 5, pp. 112-130, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@Article{grohe_et_al:DagRep.12.5.112,
  author =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)}},
  pages =	{112--130},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2022},
  volume =	{12},
  number =	{5},
  editor =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.12.5.112},
  URN =		{urn:nbn:de:0030-drops-174453},
  doi =		{10.4230/DagRep.12.5.112},
  annote =	{Keywords: Constraint satisfaction problem (CSP); Computational complexity; Hardness of approximation; Universal algebra; Semidefinite programming}
}
Document
Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
We generalize the monotone local search approach of Fomin, Gaspers, Lokshtanov and Saurabh [J. ACM 2019], by establishing a connection between parameterized approximation and exponential-time approximation algorithms for monotone subset minimization problems. In a monotone subset minimization problem the input implicitly describes a non-empty set family over a universe of size n which is closed under taking supersets. The task is to find a minimum cardinality set in this family. Broadly speaking, we use approximate monotone local search to show that a parameterized α-approximation algorithm that runs in c^k⋅n^𝒪(1) time, where k is the solution size, can be used to derive an α-approximation randomized algorithm that runs in dⁿ⋅n^𝒪(1) time, where d is the unique value in (1, 1+(c-1)/α) such that 𝒟(1/α ‖ (d-1)/(c-1)) = (ln c)/α and 𝒟(a‖b) is the Kullback-Leibler divergence. This running time matches that of Fomin et al. for α = 1, and is strictly better when α > 1, for any c > 1. Furthermore, we also show that this result can be derandomized at the expense of a sub-exponential multiplicative factor in the running time. We use an approximate variant of exhaustive search as a benchmark for our algorithm. We show that the classic 2ⁿ⋅n^𝒪(1) exhaustive search can be adapted to an α-approximate exhaustive search that runs in time (1+exp(-α⋅ℋ(1/α)))ⁿ⋅n^𝒪(1), where ℋ is the entropy function. Furthermore, we provide a lower bound stating that the running time of this α-approximate exhaustive search is the best achievable running time in an oracle model. When compared to approximate exhaustive search, and to other techniques, the running times obtained by approximate monotone local search are strictly better for any α ≥ 1, c > 1. We demonstrate the potential of approximate monotone local search by deriving new and faster exponential approximation algorithms for Vertex Cover, 3-Hitting Set, Directed Feedback Vertex Set, Directed Subset Feedback Vertex Set, Directed Odd Cycle Transversal and Undirected Multicut. For instance, we get a 1.1-approximation algorithm for Vertex Cover with running time 1.114ⁿ⋅n^𝒪(1), improving upon the previously best known 1.1-approximation running in time 1.127ⁿ⋅n^𝒪(1) by Bourgeois et al. [DAM 2011].
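
The defining equation for d can be evaluated numerically. The sketch below (our own Python helpers; 𝒟 is taken to be the standard Kullback-Leibler divergence between Bernoulli distributions) solves 𝒟(1/α ‖ (d-1)/(c-1)) = (ln c)/α by bisection, using the fact that the left-hand side decreases from +∞ to 0 as d ranges over (1, 1+(c-1)/α).

import math

def kl(a, b):
    # Kullback-Leibler divergence D(a || b) between Bernoulli(a) and Bernoulli(b).
    term = lambda p, q: 0.0 if p == 0 else p * math.log(p / q)
    return term(a, b) + term(1 - a, 1 - b)

def base_d(c, alpha):
    # Unique d in (1, 1 + (c-1)/alpha) with D(1/alpha || (d-1)/(c-1)) = ln(c)/alpha.
    target = math.log(c) / alpha
    lo, hi = 1.0, 1.0 + (c - 1) / alpha
    for _ in range(100):
        mid = (lo + hi) / 2
        if kl(1 / alpha, (mid - 1) / (c - 1)) > target:
            lo = mid   # divergence still too large, so d lies further to the right
        else:
            hi = mid
    return (lo + hi) / 2

# For alpha = 1 this collapses to d = 2 - 1/c, the base of monotone local search;
# for alpha > 1 the base is strictly smaller, as stated in the abstract.
print(round(base_d(2.0, 1.0), 4))   # 1.5
print(round(base_d(2.0, 1.1), 4))   # strictly smaller than 1.5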

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 50:1-50:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{esmer_et_al:LIPIcs.ESA.2022.50,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{50:1--50:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.50},
  URN =		{urn:nbn:de:0030-drops-169887},
  doi =		{10.4230/LIPIcs.ESA.2022.50},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves

Authors: Karl Bringmann, Sándor Kisfaludi-Bak, Marvin Künnemann, Dániel Marx, and André Nusser

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
The Dynamic Time Warping (DTW) distance is a popular measure of similarity for a variety of sequence data. For comparing polygonal curves π, σ in ℝ^d, it provides a robust, outlier-insensitive alternative to the Fréchet distance. However, like the Fréchet distance, the DTW distance is not invariant under translations. Can we efficiently optimize the DTW distance of π and σ under arbitrary translations, to compare the curves' shape irrespective of their absolute location? There are surprisingly few works in this direction, which may be due to its computational intricacy: For the Euclidean norm, this problem contains as a special case the geometric median problem, which provably admits no exact algebraic algorithm (that is, no algorithm using only addition, multiplication, and k-th roots). We thus investigate exact algorithms for non-Euclidean norms as well as approximation algorithms for the Euclidean norm. For the L₁ norm in ℝ^d, we provide an 𝒪(n^{2(d+1)})-time algorithm, i.e., an exact polynomial-time algorithm for constant d. Here and below, n bounds the curves' complexities. For the Euclidean norm in ℝ², we show that a simple problem-specific insight leads to a (1+ε)-approximation in time 𝒪(n³/ε²). Using significant new ideas, we then show how to obtain a subcubic 𝒪̃(n^{2.5}/ε²)-time algorithm; this running time comes close to the well-known quadratic-time barrier for computing DTW for fixed translations. Technically, the algorithm is obtained by speeding up repeated DTW distance estimations using a dynamic data structure for maintaining shortest paths in weighted planar digraphs. Crucially, we show how to traverse a candidate set of translations using space-filling curves in a way that incurs only few updates to the data structure. We hope that our results will facilitate the use of DTW under translation both in theory and practice, and inspire similar algorithmic approaches for related geometric optimization problems.
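
For reference, the fixed-translation baseline the abstract compares against is the classical quadratic-time dynamic program for DTW. The Python sketch below is our own illustration (it sums pointwise Euclidean distances along a warping path, one common convention) and is not the translation-invariant algorithm of the paper; the toy example shows how translating one curve changes the DTW value.

import math

def dtw(pi, sigma, dist=math.dist):
    # Standard O(n*m) dynamic program: dp[i][j] is the cheapest warping of the
    # first i points of pi onto the first j points of sigma.
    n, m = len(pi), len(sigma)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(pi[i - 1], sigma[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[n][m]

a = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
b = [(0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
print(dtw(a, b))                                  # 3.0
print(dtw(a, [(x, y - 1.0) for x, y in b]))       # 0.0 after translating b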

Cite as

Karl Bringmann, Sándor Kisfaludi-Bak, Marvin Künnemann, Dániel Marx, and André Nusser. Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 20:1-20:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{bringmann_et_al:LIPIcs.SoCG.2022.20,
  author =	{Bringmann, Karl and Kisfaludi-Bak, S\'{a}ndor and K\"{u}nnemann, Marvin and Marx, D\'{a}niel and Nusser, Andr\'{e}},
  title =	{{Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{20:1--20:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.20},
  URN =		{urn:nbn:de:0030-drops-160287},
  doi =		{10.4230/LIPIcs.SoCG.2022.20},
  annote =	{Keywords: Dynamic Time Warping, Sequence Similarity Measures}
}
Document
Track A: Algorithms, Complexity and Games
Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
In the General Factor problem, we are given an undirected graph G and for each vertex v ∈ V(G) a finite set B_v of non-negative integers. The task is to decide if there is a subset S ⊆ E(G) such that deg_S(v) ∈ B_v for all vertices v of G. Define the max-gap of a finite integer set B to be the largest d ≥ 0 such that there is an a ≥ 0 with [a,a+d+1] ∩ B = {a,a+d+1}. Cornuéjols showed in 1988 that if the max-gap of all sets B_v is at most 1, then the decision version of General Factor is polynomial-time solvable. This result was extended in 2018 by Dudycz and Paluch to the optimization (i.e. minimization and maximization) versions. We present a general algorithm counting the number of solutions of a certain size in time (M+1)^{tw} ⋅ n^{𝒪(1)}, given a tree decomposition of width tw, where M is the maximum integer over all B_v. By using convolution techniques from van Rooij (2020), we improve upon the previous (M+1)^{3tw} ⋅ n^{𝒪(1)}-time algorithm by Arulselvan et al. from 2018. We prove that this algorithm is essentially optimal for all cases that are not trivial or polynomial-time solvable for the decision, minimization or maximization versions. Our lower bounds show that such an improvement is not even possible for B-Factor, which is General Factor on graphs where all sets B_v agree with the fixed set B. We show that for every fixed B where the problem is NP-hard, our (max B+1)^{tw} ⋅ n^{𝒪(1)}-time algorithm cannot be significantly improved: assuming the Strong Exponential Time Hypothesis (SETH), no algorithm can solve B-Factor in time (max B+1-ε)^{tw} ⋅ n^{𝒪(1)} for any ε > 0. We extend this bound to the counting version of B-Factor for arbitrary, non-trivial sets B, assuming #SETH. We also investigate the parameterization of the problem by cutwidth. Unlike for treewidth, having a larger set B does not appear to make the problem harder: we give a 2^{cutw} ⋅ n^{𝒪(1)}-time algorithm for any B and provide a matching lower bound showing that this is optimal for the NP-hard cases.
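
The max-gap of B is simply the longest run of integers missing strictly between two consecutive elements of B. A small Python sketch (the helper name max_gap and the convention of returning 0 when B has fewer than two elements are ours; the abstract does not fix that corner case):

def max_gap(B):
    # Largest d >= 0 such that some a and a+d+1 are both in B while
    # a+1, ..., a+d are all missing from B.
    s = sorted(B)
    return max((b - a - 1 for a, b in zip(s, s[1:])), default=0)

# Cornuejols' tractable case asks for max-gap at most 1 in every B_v:
print(max_gap({1, 3, 5}))   # 1  (decision version polynomial-time solvable)
print(max_gap({0, 3}))      # 2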

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 95:1-95:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{marx_et_al:LIPIcs.ICALP.2021.95,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{95:1--95:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.95},
  URN =		{urn:nbn:de:0030-drops-141647},
  doi =		{10.4230/LIPIcs.ICALP.2021.95},
  annote =	{Keywords: General Factor, General Matching, Treewidth, Cutwidth}
}
Document
On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting

Authors: Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram

Published in: LIPIcs, Volume 192, 2nd Symposium on Foundations of Responsible Computing (FORC 2021)


Abstract
Redistricting is the problem of dividing up a state into a given number k of regions (called districts) where the voters in each district are to elect a representative. The three primary criteria are: that each district be connected, that the populations of the districts be equal (or nearly equal), and that the districts are "compact". There are multiple competing definitions of compactness, usually minimizing some quantity. One measure that has recently been used is the number of cut edges. In this formulation of redistricting, one is given atomic regions out of which each district must be built (e.g., in the U.S., census blocks). The populations of the atomic regions are given. Consider the graph with one vertex per atomic region and an edge between atomic regions with a shared boundary of positive length. Define the weight of a vertex to be the population of the corresponding region. A districting plan is a partition of vertices into k pieces so that the parts have nearly equal weights and each part is connected. The districts are considered compact to the extent that the plan minimizes the number of edges crossing between different parts. There are two natural computational problems: find the most compact districting plan, and sample districting plans (possibly under a compactness constraint) uniformly at random. Both problems are NP-hard, so we consider restricting the input graph to have branchwidth at most w. (A planar graph’s branchwidth is bounded, for example, by its diameter.) If both k and w are bounded by constants, the problems are solvable in polynomial time. In this paper, we give lower and upper bounds that characterize the complexity of these problems in terms of parameters k and w. For simplicity of notation, assume that each vertex has unit weight. We would ideally like algorithms whose running times are of the form O(f(k,w) n^c) for some constant c independent of k and w (in which case the problems are said to be fixed-parameter tractable with respect to those parameters). We show that, under standard complexity-theoretic assumptions, no such algorithms exist. However, the problems are fixed-parameter tractable with respect to each of these parameters individually: there exist algorithms with running times of the form O(f(k) n^{O(w)}) and O(f(w) n^{k+1}). The first result was previously known. The new one, however, is more relevant to the application to redistricting, at least for coarse instances. Indeed, we have implemented a version of the algorithm and have used it to successfully find optimally compact solutions to all redistricting instances for France (except Paris, which operates under different rules) under various population-balance constraints. For these instances, the values for w are modest and the values for k are very small.
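
A districting plan in this graph formulation can be checked directly. The sketch below is only an illustrative validity checker in Python (our own names; it takes adjacency lists, vertex weights, and a vertex-to-district map), not the branchwidth-based optimization or sampling algorithms of the paper: it verifies that every district is connected and weight-balanced and then reports the number of cut edges, the compactness measure used above.

def plan_cut_edges(adj, weight, parts, slack=0):
    # adj: vertex -> list of neighbours, weight: vertex -> population,
    # parts: vertex -> district label. Returns the number of cut edges,
    # or None if some district is disconnected or the weights are unbalanced.
    for d in set(parts.values()):
        verts = [v for v in parts if parts[v] == d]
        seen, stack = {verts[0]}, [verts[0]]
        while stack:                      # flood fill inside district d
            for u in adj[stack.pop()]:
                if parts[u] == d and u not in seen:
                    seen.add(u)
                    stack.append(u)
        if len(seen) != len(verts):
            return None
    totals = {}
    for v, d in parts.items():
        totals[d] = totals.get(d, 0) + weight[v]
    if max(totals.values()) - min(totals.values()) > slack:
        return None
    return sum(1 for v in adj for u in adj[v] if v < u and parts[v] != parts[u])

# A 2x2 grid of unit-population regions split into two columns: 2 cut edges.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(plan_cut_edges(adj, {v: 1 for v in adj}, {0: "A", 2: "A", 1: "B", 3: "B"}))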

Cite as

Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram. On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting. In 2nd Symposium on Foundations of Responsible Computing (FORC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 192, pp. 3:1-3:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cohenaddad_et_al:LIPIcs.FORC.2021.3,
  author =	{Cohen-Addad, Vincent and Klein, Philip N. and Marx, D\'{a}niel and Wheeler, Archer and Wolfram, Christopher},
  title =	{{On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting}},
  booktitle =	{2nd Symposium on Foundations of Responsible Computing (FORC 2021)},
  pages =	{3:1--3:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-187-0},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{192},
  editor =	{Ligett, Katrina and Gupta, Swati},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FORC.2021.3},
  URN =		{urn:nbn:de:0030-drops-138718},
  doi =		{10.4230/LIPIcs.FORC.2021.3},
  annote =	{Keywords: redistricting, algorithms, planar graphs, lower bounds}
}
Document
Chordless Cycle Packing Is Fixed-Parameter Tractable

Authors: Dániel Marx

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
A chordless cycle or hole in a graph G is an induced cycle of length at least 4. In the Hole Packing problem, a graph G and an integer k are given, and the task is to find (if one exists) a set of k pairwise vertex-disjoint chordless cycles. Our main result is showing that Hole Packing is fixed-parameter tractable (FPT), that is, it can be solved in time f(k)n^O(1) for some function f depending only on k.

Cite as

Dániel Marx. Chordless Cycle Packing Is Fixed-Parameter Tractable. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 71:1-71:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{marx:LIPIcs.ESA.2020.71,
  author =	{Marx, D\'{a}niel},
  title =	{{Chordless Cycle Packing Is Fixed-Parameter Tractable}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{71:1--71:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.71},
  URN =		{urn:nbn:de:0030-drops-129373},
  doi =		{10.4230/LIPIcs.ESA.2020.71},
  annote =	{Keywords: chordal graphs, packing, fixed-parameter tractability}
}
Document
Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy

Authors: Dániel Marx and R. B. Sandeep

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
Given a graph G and an integer k, the H-free Edge Editing problem is to find whether there exist at most k pairs of vertices in G such that changing the adjacency of the pairs in G results in a graph without any induced copy of H. The existence of polynomial kernels for H-free Edge Editing (that is, whether it is possible to reduce the size of the instance to k^O(1) in polynomial time) received significant attention in the parameterized complexity literature. Nontrivial polynomial kernels are known to exist for some graphs H with at most 4 vertices (e.g., path on 3 or 4 vertices, diamond, paw), but starting from 5 vertices, polynomial kernels are known only if H is either complete or empty. This suggests the conjecture that there is no other H with at least 5 vertices where H-free Edge Editing admits a polynomial kernel. Towards this goal, we obtain a set ℋ of nine 5-vertex graphs such that if for every H ∈ ℋ, H-free Edge Editing is incompressible and the complexity assumption NP ⊈ coNP/poly holds, then H-free Edge Editing is incompressible for every graph H with at least five vertices that is neither complete nor empty. That is, proving incompressibility for these nine graphs would give a complete classification of the kernelization complexity of H-free Edge Editing for every H with at least 5 vertices. We obtain a similar result for H-free Edge Deletion. Here the picture is more complicated due to the existence of another infinite family of graphs H where the problem is trivial (graphs with exactly one edge). We obtain a larger set ℋ of nineteen graphs whose incompressibility would give a complete classification of the kernelization complexity of H-free Edge Deletion for every graph H with at least 5 vertices. Analogous results follow also for the H-free Edge Completion problem by simple complementation.
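
For completeness, "H-free" means containing no induced copy of H. The following brute-force induced-subgraph test (our own Python sketch over adjacency-set graphs, exponential in |V(H)| and unrelated to the kernelization machinery discussed above) makes the editing objective concrete.

from itertools import combinations, permutations

def has_induced_copy(G, H):
    # G, H: dicts mapping each vertex to its set of neighbours.
    # Try every |V(H)|-subset of G and every bijection onto V(H), and check
    # that adjacencies and non-adjacencies match exactly (induced copy).
    hv = list(H)
    for subset in combinations(G, len(hv)):
        for mapping in permutations(subset):
            if all((mapping[i] in G[mapping[j]]) == (hv[i] in H[hv[j]])
                   for i in range(len(hv)) for j in range(i + 1, len(hv))):
                return True
    return False

# A star on 4 vertices contains an induced path on 3 vertices, so it is not P3-free.
star = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
p3 = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
print(has_induced_copy(star, p3))   # True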

Cite as

Dániel Marx and R. B. Sandeep. Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 72:1-72:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{marx_et_al:LIPIcs.ESA.2020.72,
  author =	{Marx, D\'{a}niel and Sandeep, R. B.},
  title =	{{Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{72:1--72:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.72},
  URN =		{urn:nbn:de:0030-drops-129383},
  doi =		{10.4230/LIPIcs.ESA.2020.72},
  annote =	{Keywords: incompressibility, edge modification problems, H-free graphs}
}
Document
Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction

Authors: Marvin Künnemann and Dániel Marx

Published in: LIPIcs, Volume 169, 35th Computational Complexity Conference (CCC 2020)


Abstract
To study the question under which circumstances small solutions can be found faster than by exhaustive search (and by how much), we study the fine-grained complexity of Boolean constraint satisfaction with size constraint exactly k. More precisely, we aim to determine, for any finite constraint family, the optimal running time f(k)n^g(k) required to find satisfying assignments that set precisely k of the n variables to 1. Under central hardness assumptions on detecting cliques in graphs and 3-uniform hypergraphs, we give an almost tight characterization of g(k) into four regimes: 1) Brute force is essentially best-possible, i.e., g(k) = (1 ± o(1))k, 2) the best algorithms are as fast as current k-clique algorithms, i.e., g(k) = (ω/3 ± o(1))k, 3) the exponent has sublinear dependence on k with g(k) ∈ [Ω(∛k), O(√k)], or 4) the problem is fixed-parameter tractable, i.e., g(k) = O(1). This yields a more fine-grained perspective than a previous FPT/W[1]-hardness dichotomy (Marx, Computational Complexity 2005). Our most interesting technical contribution is an f(k)n^{4√k}-time algorithm for SubsetSum with precedence constraints parameterized by the target k; in particular, the approach, based on generalizing a bound on the Frobenius coin problem to a setting with precedence constraints, might be of independent interest.

Cite as

Marvin Künnemann and Dániel Marx. Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction. In 35th Computational Complexity Conference (CCC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 169, pp. 27:1-27:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{kunnemann_et_al:LIPIcs.CCC.2020.27,
  author =	{K\"{u}nnemann, Marvin and Marx, D\'{a}niel},
  title =	{{Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction}},
  booktitle =	{35th Computational Complexity Conference (CCC 2020)},
  pages =	{27:1--27:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-156-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{169},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2020.27},
  URN =		{urn:nbn:de:0030-drops-125791},
  doi =		{10.4230/LIPIcs.CCC.2020.27},
  annote =	{Keywords: Fine-grained complexity theory, algorithmic classification theorem, multivariate algorithms and complexity, constraint satisfaction problems, satisfiability}
}
Document
Track A: Algorithms, Complexity and Games
Hitting Long Directed Cycles Is Fixed-Parameter Tractable

Authors: Alexander Göke, Dániel Marx, and Matthias Mnich

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
In the Directed Long Cycle Hitting Set problem we are given a directed graph G, and the task is to find a set S of at most k vertices/arcs such that G-S has no cycle of length longer than ℓ. We show that the problem can be solved in time 2^O(ℓ^6 + ℓ k^3 log k + k^5 log k log ℓ) ⋅ n^O(1), that is, it is fixed-parameter tractable (FPT) parameterized by k and ℓ. This algorithm can be seen as a far-reaching generalization of the fixed-parameter tractability of Mixed Graph Feedback Vertex Set [Bonsma and Lokshtanov WADS 2011], which is already a common generalization of the fixed-parameter tractability of (undirected) Feedback Vertex Set and the Directed Feedback Vertex Set problems, two classic results in parameterized algorithms. The algorithm requires significant insights into the structure of graphs without directed cycles of length longer than ℓ and can be seen as an exact version of the approximation algorithm following from the Erdős-Pósa property for long cycles in directed graphs proved by Kreutzer and Kawarabayashi [STOC 2015].

Cite as

Alexander Göke, Dániel Marx, and Matthias Mnich. Hitting Long Directed Cycles Is Fixed-Parameter Tractable. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 59:1-59:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{goke_et_al:LIPIcs.ICALP.2020.59,
  author =	{G\"{o}ke, Alexander and Marx, D\'{a}niel and Mnich, Matthias},
  title =	{{Hitting Long Directed Cycles Is Fixed-Parameter Tractable}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{59:1--59:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.59},
  URN =		{urn:nbn:de:0030-drops-124664},
  doi =		{10.4230/LIPIcs.ICALP.2020.59},
  annote =	{Keywords: Directed graphs, directed feedback vertex set, circumference}
}
Document
Finding and Counting Permutations via CSPs

Authors: Benjamin Aram Berendsohn, László Kozma, and Dániel Marx

Published in: LIPIcs, Volume 148, 14th International Symposium on Parameterized and Exact Computation (IPEC 2019)


Abstract
Permutation patterns and pattern avoidance have been intensively studied in combinatorics and computer science, going back at least to the seminal work of Knuth on stack-sorting (1968). Perhaps the most natural algorithmic question in this area is deciding whether a given permutation of length n contains a given pattern of length k. In this work we give two new algorithms for this well-studied problem, one whose running time is n^{k/4 + o(k)}, and a polynomial-space algorithm whose running time is the better of O(1.6181^n) and O(n^{k/2 + 1}). These results improve the earlier best bounds of n^{0.47k + o(k)} and O(1.79^n) due to Ahal and Rabinovich (2000) and Bruner and Lackner (2012), respectively, and are the fastest algorithms for the problem when k ∈ Ω(log n). We show that both our new algorithms and the previous exponential-time algorithms in the literature can be viewed through the unifying lens of constraint satisfaction. Our algorithms can also count, within the same running time, the number of occurrences of a pattern. We show that this result is close to optimal: solving the counting problem in time f(k) ⋅ n^{o(k/log k)} would contradict the exponential-time hypothesis (ETH). For some special classes of patterns we obtain improved running times. We further prove that 3-increasing and 3-decreasing permutations can, in some sense, embed arbitrary permutations of almost linear length, which indicates that an algorithm with sub-exponential running time is unlikely, even for patterns from these restricted classes.
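
The underlying decision problem is easy to state in code. The following Python sketch (our own names) tests pattern containment by brute force over all length-k subsequences, i.e., in roughly n^k time, far slower than the n^{k/4 + o(k)} algorithm of the paper; it is meant only to pin down the definition.

from itertools import combinations

def contains_pattern(perm, pattern):
    # Does perm have a subsequence whose entries are ordered like pattern?
    k = len(pattern)
    order = sorted(range(k), key=lambda i: pattern[i])   # relative order of the pattern
    for idx in combinations(range(len(perm)), k):
        sub = [perm[i] for i in idx]
        if sorted(range(k), key=lambda i: sub[i]) == order:
            return True
    return False

print(contains_pattern([4, 1, 5, 3, 2], [1, 3, 2]))   # True, e.g. via 1, 5, 3
print(contains_pattern([1, 3, 2, 5, 4], [3, 2, 1]))   # False: no decreasing triple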

Cite as

Benjamin Aram Berendsohn, László Kozma, and Dániel Marx. Finding and Counting Permutations via CSPs. In 14th International Symposium on Parameterized and Exact Computation (IPEC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 148, pp. 1:1-1:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{berendsohn_et_al:LIPIcs.IPEC.2019.1,
  author =	{Berendsohn, Benjamin Aram and Kozma, L\'{a}szl\'{o} and Marx, D\'{a}niel},
  title =	{{Finding and Counting Permutations via CSPs}},
  booktitle =	{14th International Symposium on Parameterized and Exact Computation (IPEC 2019)},
  pages =	{1:1--1:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-129-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{148},
  editor =	{Jansen, Bart M. P. and Telle, Jan Arne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2019.1},
  URN =		{urn:nbn:de:0030-drops-114627},
  doi =		{10.4230/LIPIcs.IPEC.2019.1},
  annote =	{Keywords: permutations, pattern matching, constraint satisfaction, exponential time}
}
Document
Parameterized Streaming Algorithms for Min-Ones d-SAT

Authors: Akanksha Agrawal, Arindam Biswas, Édouard Bonnet, Nick Brettell, Radu Curticapean, Dániel Marx, Tillmann Miltzow, Venkatesh Raman, and Saket Saurabh

Published in: LIPIcs, Volume 150, 39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019)


Abstract
In this work, we initiate the study of the Min-Ones d-SAT problem in the parameterized streaming model. An instance of the problem consists of a d-CNF formula F and an integer k, and the objective is to determine if F has a satisfying assignment which sets at most k variables to 1. In the parameterized streaming model, input is provided as a stream, just as in the usual streaming model. A key difference is that the bound on the read-write memory available to the algorithm is O(f(k) log n) (f: ℕ → ℕ, a computable function) as opposed to the O(log n) bound of the usual streaming model. The other important difference is that the number of passes the algorithm makes over its input must be a (preferably small) function of k. We design a (k + 1)-pass parameterized streaming algorithm that solves Min-Ones d-SAT (d ≥ 2) using space O((kd^{ck} + k^d) log n) (c > 0, a constant) and a (d + 1)^k-pass algorithm that uses space O(k log n). We also design a streaming kernelization for Min-Ones 2-SAT that makes (k + 2) passes and uses space O(k^6 log n) to produce a kernel with O(k^6) clauses. To complement these positive results, we show that any k-pass algorithm for Min-Ones d-SAT (d ≥ 2) requires space Ω(max{n^{1/k} / 2^k, log(n / k)}) on instances (F, k). This is achieved via a reduction from the streaming problem POT Pointer Chasing (Guha and McGregor [ICALP 2008]), which might be of independent interest. Given this, our (k + 1)-pass parameterized streaming algorithm is the best possible as far as the number of passes is concerned. In contrast to the results of Fafianie and Kratsch [MFCS 2014] and Chitnis et al. [SODA 2015], who independently showed that there are 1-pass parameterized streaming algorithms for Vertex Cover (a restriction of Min-Ones 2-SAT), we show using lower bounds from Communication Complexity that for any d ≥ 1, a 1-pass streaming algorithm for Min-Ones d-SAT requires space Ω(n). This excludes the possibility of a 1-pass parameterized streaming algorithm for the problem. Additionally, we show that any p-pass algorithm for the problem requires space Ω(n/p).
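
Outside the streaming model the problem itself is easy to state. The following offline brute-force Python sketch (our own code; it stores the whole formula and enumerates every subset of at most k variables, so it respects none of the space bounds above) is meant only to pin down the Min-Ones d-SAT objective.

from itertools import combinations

def min_ones_sat(clauses, n_vars, k):
    # clauses: list of clauses, each a list of non-zero literals (+i / -i for
    # variable i). Returns a set of at most k variables to set to 1 that
    # satisfies every clause, or None if no such assignment exists.
    for r in range(k + 1):
        for ones in combinations(range(1, n_vars + 1), r):
            assign = set(ones)
            if all(any((lit > 0) == (abs(lit) in assign) for lit in clause)
                   for clause in clauses):
                return assign
    return None

# (x1 or x2) and (not x1 or x3): setting only x2 to 1 satisfies both clauses.
print(min_ones_sat([[1, 2], [-1, 3]], n_vars=3, k=1))   # {2}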

Cite as

Akanksha Agrawal, Arindam Biswas, Édouard Bonnet, Nick Brettell, Radu Curticapean, Dániel Marx, Tillmann Miltzow, Venkatesh Raman, and Saket Saurabh. Parameterized Streaming Algorithms for Min-Ones d-SAT. In 39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 150, pp. 8:1-8:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{agrawal_et_al:LIPIcs.FSTTCS.2019.8,
  author =	{Agrawal, Akanksha and Biswas, Arindam and Bonnet, \'{E}douard and Brettell, Nick and Curticapean, Radu and Marx, D\'{a}niel and Miltzow, Tillmann and Raman, Venkatesh and Saurabh, Saket},
  title =	{{Parameterized Streaming Algorithms for Min-Ones d-SAT}},
  booktitle =	{39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019)},
  pages =	{8:1--8:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-131-3},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{150},
  editor =	{Chattopadhyay, Arkadev and Gastin, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2019.8},
  URN =		{urn:nbn:de:0030-drops-115708},
  doi =		{10.4230/LIPIcs.FSTTCS.2019.8},
  annote =	{Keywords: min, ones, sat, d-sat, parameterized, kernelization, streaming, space, efficient, algorithm, parameter}
}
Document
How Does Object Fatness Impact the Complexity of Packing in d Dimensions?

Authors: Sándor Kisfaludi-Bak, Dániel Marx, and Tom C. van der Zanden

Published in: LIPIcs, Volume 149, 30th International Symposium on Algorithms and Computation (ISAAC 2019)


Abstract
Packing is a classical problem where one is given a set of subsets of Euclidean space called objects, and the goal is to find a maximum-size subset of objects that are pairwise non-intersecting. The problem is also known as the Independent Set problem on the intersection graph defined by the objects. Although the problem is NP-complete, there are several subexponential algorithms in the literature. One of the key assumptions of such algorithms has been that the objects are fat, with a few exceptions in two dimensions; for example, the packing problem of a set of polygons in the plane surprisingly admits a subexponential algorithm. In this paper we give tight running time bounds for packing similarly-sized non-fat objects in higher dimensions. We propose an alternative and very weak measure of fatness called the stabbing number, and show that the packing problem in Euclidean space of constant dimension d ≥ 3 for a family of similarly sized objects with stabbing number α can be solved in 2^{O(n^{1-1/d} α)} time. We prove that even in the case of axis-parallel boxes of fixed shape, there is no 2^{o(n^{1-1/d} α)}-time algorithm under ETH. This result smoothly bridges the whole range between having constant-fat objects on one extreme (α = 1), with a subexponential algorithm of the usual running time, and having very "skinny" objects on the other extreme (α = n^{1/d}), where we cannot hope to improve upon the brute-force running time of 2^{O(n)}; it thereby characterizes the impact of fatness on the complexity of packing in the case of similarly sized objects. We also study the same problem when parameterized by the solution size k, and give an n^{O(k^{1-1/d} α)}-time algorithm, with an almost matching lower bound: there is no algorithm with running time of the form f(k) n^{o(k^{1-1/d} α / log k)} under ETH. One of our main tools in these reductions is a new wiring theorem that may be of independent interest.

Cite as

Sándor Kisfaludi-Bak, Dániel Marx, and Tom C. van der Zanden. How Does Object Fatness Impact the Complexity of Packing in d Dimensions?. In 30th International Symposium on Algorithms and Computation (ISAAC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 149, pp. 36:1-36:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{kisfaludibak_et_al:LIPIcs.ISAAC.2019.36,
  author =	{Kisfaludi-Bak, S\'{a}ndor and Marx, D\'{a}niel and van der Zanden, Tom C.},
  title =	{{How Does Object Fatness Impact the Complexity of Packing in d Dimensions?}},
  booktitle =	{30th International Symposium on Algorithms and Computation (ISAAC 2019)},
  pages =	{36:1--36:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-130-6},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{149},
  editor =	{Lu, Pinyan and Zhang, Guochuan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2019.36},
  URN =		{urn:nbn:de:0030-drops-115327},
  doi =		{10.4230/LIPIcs.ISAAC.2019.36},
  annote =	{Keywords: Geometric intersection graph, Independent Set, Object fatness}
}
Document
New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041)

Authors: Fedor V. Fomin, Dániel Marx, Saket Saurabh, and Meirav Zehavi

Published in: Dagstuhl Reports, Volume 9, Issue 1 (2019)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 19041 "New Horizons in Parameterized Complexity". Parameterized Complexity is celebrating its 30th birthday in 2019. In these three decades, there has been tremendous progress in developing the area. The central vision of Parameterized Complexity through all these years has been to provide the algorithmic and complexity-theoretic toolkit for studying multivariate algorithmics in different disciplines and subfields of Computer Science. These tools are universal: they not only helped in the development of the core of Parameterized Complexity, but also led to its success in other subfields of Computer Science such as Approximation Algorithms, Computational Social Choice, Computational Geometry, and problems solvable in P (polynomial time), to name a few. In the last few years, we have witnessed several exciting developments of new parameterized techniques and tools in the following subfields of Computer Science and Optimization: Mathematical Programming, Computational Linear Algebra, Computational Counting, Derandomization, and Approximation Algorithms. The main objective of the seminar was to initiate a discussion on which of the recent domain-specific algorithmic and complexity advances can become useful in other domains.

Cite as

Fedor V. Fomin, Dániel Marx, Saket Saurabh, and Meirav Zehavi. New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041). In Dagstuhl Reports, Volume 9, Issue 1, pp. 67-87, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@Article{fomin_et_al:DagRep.9.1.67,
  author =	{Fomin, Fedor V. and Marx, D\'{a}niel and Saurabh, Saket and Zehavi, Meirav},
  title =	{{New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041)}},
  pages =	{67--87},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2019},
  volume =	{9},
  number =	{1},
  editor =	{Fomin, Fedor V. and Marx, D\'{a}niel and Saurabh, Saket and Zehavi, Meirav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.9.1.67},
  URN =		{urn:nbn:de:0030-drops-105706},
  doi =		{10.4230/DagRep.9.1.67},
  annote =	{Keywords: Intractability, Parameterized Complexity}
}
Document
Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs

Authors: Vincent Cohen-Addad, Éric Colin de Verdière, Dániel Marx, and Arnaud de Mesmay

Published in: LIPIcs, Volume 129, 35th International Symposium on Computational Geometry (SoCG 2019)


Abstract
We prove essentially tight lower bounds, conditionally to the Exponential Time Hypothesis, for two fundamental but seemingly very different cutting problems on surface-embedded graphs: the Shortest Cut Graph problem and the Multiway Cut problem. A cut graph of a graph G embedded on a surface S is a subgraph of G whose removal from S leaves a disk. We consider the problem of deciding whether an unweighted graph embedded on a surface of genus g has a cut graph of length at most a given value. We prove a time lower bound for this problem of n^{Ω(g/log g)} conditionally to ETH. In other words, the first n^{O(g)}-time algorithm by Erickson and Har-Peled [SoCG 2002, Discr. Comput. Geom. 2004] is essentially optimal. We also prove that the problem is W[1]-hard when parameterized by the genus, answering a 17-year-old question of these authors. A multiway cut of an undirected graph G with t distinguished vertices, called terminals, is a set of edges whose removal disconnects all pairs of terminals. We consider the problem of deciding whether an unweighted graph G has a multiway cut of weight at most a given value. We prove a time lower bound for this problem of n^{Ω(√(gt + g²)/log(gt))}, conditionally to ETH, for any choice of the genus g ≥ 0 of the graph and the number of terminals t ≥ 4. In other words, the algorithm by the second author [Algorithmica 2017] (for the more general multicut problem) is essentially optimal; this extends the lower bound by the third author [ICALP 2012] (for the planar case). Reductions to planar problems usually involve a grid-like structure. The main novel idea for our results is to understand what structures instead of grids are needed if we want to exploit optimally a certain value g of the genus.

Cite as

Vincent Cohen-Addad, Éric Colin de Verdière, Dániel Marx, and Arnaud de Mesmay. Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs. In 35th International Symposium on Computational Geometry (SoCG 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 129, pp. 27:1-27:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{cohenaddad_et_al:LIPIcs.SoCG.2019.27,
  author =	{Cohen-Addad, Vincent and Colin de Verdi\`{e}re, \'{E}ric and Marx, D\'{a}niel and de Mesmay, Arnaud},
  title =	{{Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs}},
  booktitle =	{35th International Symposium on Computational Geometry (SoCG 2019)},
  pages =	{27:1--27:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-104-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{129},
  editor =	{Barequet, Gill and Wang, Yusu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2019.27},
  URN =		{urn:nbn:de:0030-drops-104311},
  doi =		{10.4230/LIPIcs.SoCG.2019.27},
  annote =	{Keywords: Cut graph, Multiway cut, Surface, Lower bound, Parameterized Complexity, Exponential Time Hypothesis}
}
Document
Multi-Budgeted Directed Cuts

Authors: Stefan Kratsch, Shaohua Li, Dániel Marx, Marcin Pilipczuk, and Magnus Wahlström

Published in: LIPIcs, Volume 115, 13th International Symposium on Parameterized and Exact Computation (IPEC 2018)


Abstract
In this paper, we study multi-budgeted variants of the classic minimum cut problem and graph separation problems that turned out to be important in parameterized complexity: Skew Multicut and Directed Feedback Arc Set. In our generalization, we assign colors 1,2,...,l to some edges and give separate budgets k_1,k_2,...,k_l for colors 1,2,...,l. For every color i in {1,...,l}, let E_i be the set of edges of color i. The solution C for the multi-budgeted variant of a graph separation problem not only needs to satisfy the usual separation requirements (i.e., be a cut, a skew multicut, or a directed feedback arc set, respectively), but also needs to satisfy that |C cap E_i| <= k_i for every i in {1,...,l}. In contrast to the classic minimum cut problem, the multi-budgeted variant turns out to be NP-hard even for l = 2. We propose FPT algorithms parameterized by k = k_1 + ... + k_l for all three problems. To this end, we develop a branching procedure for the multi-budgeted minimum cut problem that measures the progress of the algorithm not by reducing k as usual, but by elevating the capacity of some edges and thus increasing the size of the maximum source-to-sink flow. Using the fact that a similar strategy is used to enumerate all important separators of a given size, we merge this process with the flow-guided branching and show an FPT bound on the number of (appropriately defined) important multi-budgeted separators. This allows us to extend our algorithm to the Skew Multicut and Directed Feedback Arc Set problems. Furthermore, we show connections of the multi-budgeted variants with weighted variants of the directed cut problems and the Chain l-SAT problem, whose parameterized complexity remains an open problem. We show that these problems admit a bounded-in-parameter number of "maximally pushed" solutions (in a similar spirit as important separators are maximally pushed), giving somewhat weak evidence towards their tractability.
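
To make the problem definition above concrete, here is a minimal Python sketch of a verifier (not the paper's branching algorithm) for the multi-budgeted minimum cut case: it checks that a candidate set of colored arcs respects the per-color budgets |C cap E_i| <= k_i and that no source-to-sink path survives its removal. All names and the input encoding are illustrative assumptions.

from collections import defaultdict, deque

def is_multibudgeted_cut(edges, s, t, cut, budgets):
    # edges:   list of (u, v, color) directed arcs; color may be None for uncolored arcs
    # cut:     set of indices into `edges` chosen for deletion
    # budgets: dict color -> k_i; colors without a budget are treated as unconstrained
    # Illustrative verifier only; finding an optimal cut is the hard part.
    used = defaultdict(int)
    for idx in cut:
        used[edges[idx][2]] += 1
    if any(cnt > budgets.get(color, float("inf")) for color, cnt in used.items()):
        return False
    # Separation check: no s -> t path in the remaining digraph.
    adj = defaultdict(list)
    for idx, (u, v, _) in enumerate(edges):
        if idx not in cut:
            adj[u].append(v)
    seen, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return t not in seen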

Cite as

Stefan Kratsch, Shaohua Li, Dániel Marx, Marcin Pilipczuk, and Magnus Wahlström. Multi-Budgeted Directed Cuts. In 13th International Symposium on Parameterized and Exact Computation (IPEC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 115, pp. 18:1-18:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{kratsch_et_al:LIPIcs.IPEC.2018.18,
  author =	{Kratsch, Stefan and Li, Shaohua and Marx, D\'{a}niel and Pilipczuk, Marcin and Wahlstr\"{o}m, Magnus},
  title =	{{Multi-Budgeted Directed Cuts}},
  booktitle =	{13th International Symposium on Parameterized and Exact Computation (IPEC 2018)},
  pages =	{18:1--18:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-084-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{115},
  editor =	{Paul, Christophe and Pilipczuk, Michal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2018.18},
  URN =		{urn:nbn:de:0030-drops-102194},
  doi =		{10.4230/LIPIcs.IPEC.2018.18},
  annote =	{Keywords: important separators, multi-budgeted cuts, Directed Feedback Vertex Set, fixed-parameter tractability, minimum cut}
}
Document
Complete Volume
LIPIcs, Volume 107, ICALP'18, Complete Volume

Authors: Ioannis Chatzigiannakis, Christos Kaklamanis, Dániel Marx, and Donald Sannella

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
LIPIcs, Volume 107, ICALP'18, Complete Volume

Cite as

45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@Proceedings{chatzigiannakis_et_al:LIPIcs.ICALP.2018,
  title =	{{LIPIcs, Volume 107, ICALP'18, Complete Volume}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018},
  URN =		{urn:nbn:de:0030-drops-92803},
  doi =		{10.4230/LIPIcs.ICALP.2018},
  annote =	{Keywords: Theory of computation}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Ioannis Chatzigiannakis, Christos Kaklamanis, Dániel Marx, and Donald Sannella

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 0:i-0:xlviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{chatzigiannakis_et_al:LIPIcs.ICALP.2018.0,
  author =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{0:i--0:xlviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.0},
  URN =		{urn:nbn:de:0030-drops-90049},
  doi =		{10.4230/LIPIcs.ICALP.2018.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
The Parameterized Hardness of the k-Center Problem in Transportation Networks

Authors: Andreas Emil Feldmann and Dániel Marx

Published in: LIPIcs, Volume 101, 16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018)


Abstract
In this paper we study the hardness of the k-Center problem on inputs that model transportation networks. For the problem, an edge-weighted graph G=(V,E) and an integer k are given and a center set C subseteq V needs to be chosen such that |C| <= k. The aim is to minimize the maximum distance of any vertex in the graph to the closest center. This problem arises in many applications of logistics, and thus it is natural to consider inputs that model transportation networks. Such inputs are often assumed to be planar graphs, low doubling metrics, or bounded highway dimension graphs. For each of these models, parameterized approximation algorithms have been shown to exist. We complement these results by proving that the k-Center problem is W[1]-hard on planar graphs of constant doubling dimension, where the parameter is the combination of the number of centers k, the highway dimension h, and even the treewidth t. Moreover, under the Exponential Time Hypothesis there is no f(k,t,h)* n^{o(t+sqrt{k+h})} time algorithm for any computable function f. Thus it is unlikely that the optimum solution to k-Center can be found efficiently, even when assuming that the input graph abides by all of the above models for transportation networks at once! Additionally we give a simple parameterized (1+epsilon)-approximation algorithm for inputs of doubling dimension d with runtime (k^k/epsilon^{O(kd)})* n^{O(1)}. This generalizes a previous result, which considered inputs in D-dimensional L_q metrics.
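
For readers who want to see the objective in code, the sketch below is the classic farthest-first traversal (Gonzalez-style) 2-approximation for metric k-Center on a given distance matrix. It is shown purely as an illustration of the objective; it is not the paper's parameterized (1+epsilon)-approximation, and the function name and input format are assumptions.

def greedy_k_center(dist, k):
    # dist: symmetric n x n distance matrix (list of lists), k >= 1.
    # Returns (centers, radius), where radius is the max distance to the nearest center.
    n = len(dist)
    centers = [0]                       # start from an arbitrary vertex
    d_to_center = dist[0][:]            # distance of each vertex to its nearest chosen center
    while len(centers) < k:
        far = max(range(n), key=lambda v: d_to_center[v])
        centers.append(far)
        d_to_center = [min(d_to_center[v], dist[far][v]) for v in range(n)]
    return centers, max(d_to_center)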

Cite as

Andreas Emil Feldmann and Dániel Marx. The Parameterized Hardness of the k-Center Problem in Transportation Networks. In 16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 101, pp. 19:1-19:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{feldmann_et_al:LIPIcs.SWAT.2018.19,
  author =	{Feldmann, Andreas Emil and Marx, D\'{a}niel},
  title =	{{The Parameterized Hardness of the k-Center Problem in Transportation Networks}},
  booktitle =	{16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018)},
  pages =	{19:1--19:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-068-2},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{101},
  editor =	{Eppstein, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2018.19},
  URN =		{urn:nbn:de:0030-drops-88450},
  doi =		{10.4230/LIPIcs.SWAT.2018.19},
  annote =	{Keywords: k-center, parameterized complexity, planar graphs, doubling dimension, highway dimension, treewidth}
}
Document
Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms

Authors: Édouard Bonnet, Nick Brettell, O-joung Kwon, and Dániel Marx

Published in: LIPIcs, Volume 89, 12th International Symposium on Parameterized and Exact Computation (IPEC 2017)


Abstract
It has long been known that Feedback Vertex Set can be solved in time 2^{O(w log w)} n^{O(1)} on graphs of treewidth w, but it was only recently that this running time was improved to 2^{O(w)} n^{O(1)}, that is, to single-exponential dependence on the treewidth. We investigate which generalizations of Feedback Vertex Set can be solved in a similar running time. Formally, for a class of graphs P, Bounded P-Block Vertex Deletion asks, given a graph G on n vertices and positive integers k and d, whether G contains a set S of at most k vertices such that each block of G-S has at most d vertices and is in P. Assuming that P is recognizable in polynomial time and satisfies a certain natural hereditary condition, we give a sharp characterization of when single-exponential parameterized algorithms are possible for fixed values of d: - if P consists only of chordal graphs, then the problem can be solved in time 2^{O(wd^2)} n^{O(1)}, - if P contains a graph with an induced cycle of length ell >= 4, then the problem is not solvable in time 2^{o(w log w)} n^{O(1)} even for fixed d = ell, unless the ETH fails. We also study a similar problem, called Bounded P-Component Vertex Deletion, where the target graphs have connected components of small size instead of having blocks of small size, and present analogous results.

Cite as

Édouard Bonnet, Nick Brettell, O-joung Kwon, and Dániel Marx. Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms. In 12th International Symposium on Parameterized and Exact Computation (IPEC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 89, pp. 7:1-7:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{bonnet_et_al:LIPIcs.IPEC.2017.7,
  author =	{Bonnet, \'{E}douard and Brettell, Nick and Kwon, O-joung and Marx, D\'{a}niel},
  title =	{{Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms}},
  booktitle =	{12th International Symposium on Parameterized and Exact Computation (IPEC 2017)},
  pages =	{7:1--7:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-051-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{89},
  editor =	{Lokshtanov, Daniel and Nishimura, Naomi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2017.7},
  URN =		{urn:nbn:de:0030-drops-85653},
  doi =		{10.4230/LIPIcs.IPEC.2017.7},
  annote =	{Keywords: fixed-parameter tractable algorithms, treewidth, feedback vertex set}
}
Document
Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization

Authors: László Egri, Dániel Marx, and Pawel Rzazewski

Published in: LIPIcs, Volume 96, 35th Symposium on Theoretical Aspects of Computer Science (STACS 2018)


Abstract
In the list homomorphism problem, the input consists of two graphs G and H, together with a list L(v) \subseteq V(H) for every vertex v \in V(G). The task is to find a homomorphism phi: V(G) -> V(H) respecting the lists, that is, we have that phi(v) \in L(v) for every v \in V(G) and if u and v are adjacent in G, then phi(u) and phi(v) are adjacent in H. If H is a fixed graph, then the problem is denoted LHom(H). We consider the reflexive version of the problem, where we assume that every vertex in H has a self-loop. It is known that reflexive LHom(H) is polynomial-time solvable if H is an interval graph and it is NP-complete otherwise [Feder and Hell, JCTB 1998]. We explore the complexity of the problem parameterized by the treewidth tw(G) of the input graph G. If a tree decomposition of G of width tw(G) is given in the input, then the problem can be solved in time |V(H)|^{tw(G)} n^{O(1)} by naive dynamic programming. Our main result completely reveals when and by exactly how much this naive algorithm can be improved. We introduce a simple combinatorial invariant i^*(H), which is based on the existence of decompositions and incomparable sets, and show that this number should appear as the base of the exponent in the best possible running time. Specifically, we prove for every fixed non-interval graph H that * If a tree decomposition of width tw(G) is given in the input, then the problem can be solved in time i^*(H)^{tw(G)} n^{O(1)}. * Assuming the Strong Exponential-Time Hypothesis (SETH), the problem cannot be solved in time (i^*(H)-epsilon)^{tw(G)} n^{O(1)} for any epsilon > 0. Thus, by matching upper and lower bounds, our result exactly characterizes for every fixed H the complexity of reflexive LHom(H) parameterized by treewidth.
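
For readers new to the problem, the following brute-force checker spells out the definition of a list homomorphism. It is exponential in |V(G)| and purely illustrative; it has nothing to do with the i^*(H)^{tw(G)}-time dynamic programming of the paper, and the function name and input format are assumptions.

from itertools import product

def list_hom_exists(G_edges, H_edges, lists):
    # G_edges, H_edges: iterables of vertex pairs (H may contain self-loops).
    # lists: dict mapping each vertex v of G to an iterable L(v) of vertices of H.
    H_adj = set()
    for u, v in H_edges:
        H_adj.add((u, v))
        H_adj.add((v, u))
    G_vertices = list(lists)
    G_edges = [(u, v) for u, v in G_edges]
    # Try every assignment phi that respects the lists and check all edges of G.
    for choice in product(*(list(lists[v]) for v in G_vertices)):
        phi = dict(zip(G_vertices, choice))
        if all((phi[u], phi[v]) in H_adj for u, v in G_edges):
            return True
    return False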

Cite as

László Egri, Dániel Marx, and Pawel Rzazewski. Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization. In 35th Symposium on Theoretical Aspects of Computer Science (STACS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 96, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{egri_et_al:LIPIcs.STACS.2018.27,
  author =	{Egri, L\'{a}szl\'{o} and Marx, D\'{a}niel and Rzazewski, Pawel},
  title =	{{Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization}},
  booktitle =	{35th Symposium on Theoretical Aspects of Computer Science (STACS 2018)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-062-0},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{96},
  editor =	{Niedermeier, Rolf and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2018.27},
  URN =		{urn:nbn:de:0030-drops-84867},
  doi =		{10.4230/LIPIcs.STACS.2018.27},
  annote =	{Keywords: graph homomorphism, list homomorphism, reflexive graph, treewidth}
}
Document
Subexponential Parameterized Algorithms for Graphs of Polynomial Growth

Authors: Dániel Marx and Marcin Pilipczuk

Published in: LIPIcs, Volume 87, 25th Annual European Symposium on Algorithms (ESA 2017)


Abstract
We show that for a number of parameterized problems for which only 2^{O(k)} n^{O(1)} time algorithms are known on general graphs, subexponential parameterized algorithms with running time 2^{O(k^{1-1/(1+d)} log^2 k)} n^{O(1)} are possible for graphs of polynomial growth with growth rate (degree) d, that is, if we assume that every ball of radius r contains only O(r^d) vertices. The algorithms use the technique of low-treewidth pattern covering, introduced by Fomin et al. [FOCS 2016] for planar graphs; here we show how this strategy can be made to work for graphs of polynomial growth. Formally, we prove that, given a graph G of polynomial growth with growth rate d and an integer k, one can in randomized polynomial time find a subset A of V(G) such that on one hand the treewidth of G[A] is O(k^{1-1/(1+d)} log k), and on the other hand for every set X of vertices of size at most k, the probability that X is a subset of A is 2^{-O(k^{1-1/(1+d)} log^2 k)}. Together with standard dynamic programming techniques on graphs of bounded treewidth, this statement gives subexponential parameterized algorithms for a number of subgraph search problems, such as Long Path or Steiner Tree, in graphs of polynomial growth. We complement the algorithm with an almost tight lower bound for Long Path: unless the Exponential Time Hypothesis fails, no parameterized algorithm with running time 2^{k^{1-1/d-epsilon}}n^{O(1)} is possible for any positive epsilon and any integer d >= 3.

Cite as

Dániel Marx and Marcin Pilipczuk. Subexponential Parameterized Algorithms for Graphs of Polynomial Growth. In 25th Annual European Symposium on Algorithms (ESA 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 87, pp. 59:1-59:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ESA.2017.59,
  author =	{Marx, D\'{a}niel and Pilipczuk, Marcin},
  title =	{{Subexponential Parameterized Algorithms for Graphs of Polynomial Growth}},
  booktitle =	{25th Annual European Symposium on Algorithms (ESA 2017)},
  pages =	{59:1--59:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-049-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{87},
  editor =	{Pruhs, Kirk and Sohler, Christian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2017.59},
  URN =		{urn:nbn:de:0030-drops-78162},
  doi =		{10.4230/LIPIcs.ESA.2017.59},
  annote =	{Keywords: polynomial growth, subexponential algorithm, low treewidth pattern covering}
}
Document
Fine-Grained Complexity of Coloring Unit Disks and Balls

Authors: Csaba Biró, Édouard Bonnet, Dániel Marx, Tillmann Miltzow, and Pawel Rzazewski

Published in: LIPIcs, Volume 77, 33rd International Symposium on Computational Geometry (SoCG 2017)


Abstract
On planar graphs, many classic algorithmic problems enjoy a certain "square root phenomenon" and can be solved significantly faster than what is known to be possible on general graphs: for example, Independent Set, 3-Coloring, Hamiltonian Cycle, Dominating Set can be solved in time 2^O(sqrt{n}) on an n-vertex planar graph, while no 2^o(n) algorithms exist for general graphs, assuming the Exponential Time Hypothesis (ETH). The square root in the exponent seems to be best possible for planar graphs: assuming the ETH, the running time for these problems cannot be improved to 2^o(sqrt{n}). In some cases, a similar speedup can be obtained for 2-dimensional geometric problems, for example, there are 2^O(sqrt{n}log n) time algorithms for Independent Set on unit disk graphs or for TSP on 2-dimensional point sets. In this paper, we explore whether such a speedup is possible for geometric coloring problems. On the one hand, geometric objects can behave similarly to planar graphs: 3-Coloring can be solved in time 2^O(sqrt{n}) on the intersection graph of n unit disks in the plane and, assuming the ETH, there is no such algorithm with running time 2^o(sqrt{n}). On the other hand, if the number L of colors is part of the input, then no such speedup is possible: Coloring the intersection graph of n unit disks with L colors cannot be solved in time 2^o(n), assuming the ETH. More precisely, we exhibit a smooth increase of complexity as the number L of colors increases: If we restrict the number of colors to L=Theta(n^alpha) for some 0<=alpha<=1, then the problem of coloring the intersection graph of n unit disks with L colors * can be solved in time exp(O(n^{{1+alpha}/2}log n))=exp( O(sqrt{nL}log n)), and * cannot be solved in time exp(o(n^{{1+alpha}/2}))=exp(o(sqrt{nL})), unless the ETH fails. More generally, we consider the problem of coloring d-dimensional unit balls in the Euclidean space and obtain analogous results showing that the problem * can be solved in time exp(O(n^{{d-1+alpha}/d}log n))=exp(O(n^{1-1/d}L^{1/d}log n)), and * cannot be solved in time exp(n^{{d-1+alpha}/d-epsilon})= exp (O(n^{1-1/d-epsilon}L^{1/d})) for any epsilon>0, unless the ETH fails.
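
As a small illustration of the objects involved (not of the paper's exp(O(sqrt(nL) log n))-time algorithm or its lower bounds), the sketch below builds the intersection graph of unit disks from their centers and colors it with a first-fit heuristic. The names and the radius convention are assumptions made for the example.

import math

def unit_disk_graph(points, radius=1.0):
    # Disks of the given radius intersect exactly when their centers are
    # within distance 2*radius, which defines the edges of the graph.
    n = len(points)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) <= 2 * radius:
                edges.append((i, j))
    return edges

def greedy_coloring(n, edges):
    # First-fit coloring in vertex order; a heuristic, not an exact algorithm.
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    color = {}
    for v in range(n):
        taken = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in taken:
            c += 1
        color[v] = c
    return color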

Cite as

Csaba Biró, Édouard Bonnet, Dániel Marx, Tillmann Miltzow, and Pawel Rzazewski. Fine-Grained Complexity of Coloring Unit Disks and Balls. In 33rd International Symposium on Computational Geometry (SoCG 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 77, pp. 18:1-18:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{biro_et_al:LIPIcs.SoCG.2017.18,
  author =	{Bir\'{o}, Csaba and Bonnet, \'{E}douard and Marx, D\'{a}niel and Miltzow, Tillmann and Rzazewski, Pawel},
  title =	{{Fine-Grained Complexity of Coloring Unit Disks and Balls}},
  booktitle =	{33rd International Symposium on Computational Geometry (SoCG 2017)},
  pages =	{18:1--18:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-038-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{77},
  editor =	{Aronov, Boris and Katz, Matthew J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2017.18},
  URN =		{urn:nbn:de:0030-drops-71800},
  doi =		{10.4230/LIPIcs.SoCG.2017.18},
  annote =	{Keywords: unit disk graphs, unit ball graphs, coloring, exact algorithm}
}
Document
Invited Talk
Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries (Invited Talk)

Authors: Dániel Marx

Published in: LIPIcs, Volume 68, 20th International Conference on Database Theory (ICDT 2017)


Abstract
The complexity of evaluating conjunctive queries can depend significantly on the structure of the query. For example, it is well known that various notions of acyclicity can make the evaluation problem tractable. More generally, it seems that the complexity is connected to the "treelikeness" of the graph or hypergraph describing the query structure. In the lecture, we will review some of the notions of treelikeness that were proposed in the literature and how they are relevant for the complexity of evaluating conjunctive queries and related problems.

Cite as

Dániel Marx. Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries (Invited Talk). In 20th International Conference on Database Theory (ICDT 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 68, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.ICDT.2017.2,
  author =	{Marx, D\'{a}niel},
  title =	{{Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries}},
  booktitle =	{20th International Conference on Database Theory (ICDT 2017)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-024-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{68},
  editor =	{Benedikt, Michael and Orsi, Giorgio},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2017.2},
  URN =		{urn:nbn:de:0030-drops-70652},
  doi =		{10.4230/LIPIcs.ICDT.2017.2},
  annote =	{Keywords: Conjunctive queries, treewidth, complexity}
}
Document
Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix

Authors: Lin Chen, Dániel Marx, Deshi Ye, and Guochuan Zhang

Published in: LIPIcs, Volume 66, 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)


Abstract
We study approximation and parameterized algorithms for R||C_max, focusing on the problem when the rank of the matrix formed by job processing times is small. Bhaskara et al. initiated the study of approximation algorithms with respect to the rank, showing that R||C_max admits a QPTAS (quasi-polynomial time approximation scheme) when the rank is 2, and becomes APX-hard when the rank is 4. We continue this line of research. We prove that R||C_max is APX-hard even if the rank is 3, resolving an open problem. We then show that R||C_max is FPT parameterized by the rank and the largest job processing time p_max. This generalizes the parameterized results on P||C_max and R||C_max with few different types of machines. We also provide nearly tight lower bounds under the Exponential Time Hypothesis, which suggest that the running time of the FPT algorithm is unlikely to be improved significantly.

Cite as

Lin Chen, Dániel Marx, Deshi Ye, and Guochuan Zhang. Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix. In 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 66, pp. 22:1-22:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{chen_et_al:LIPIcs.STACS.2017.22,
  author =	{Chen, Lin and Marx, D\'{a}niel and Ye, Deshi and Zhang, Guochuan},
  title =	{{Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix}},
  booktitle =	{34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)},
  pages =	{22:1--22:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-028-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{66},
  editor =	{Vollmer, Heribert and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2017.22},
  URN =		{urn:nbn:de:0030-drops-70110},
  doi =		{10.4230/LIPIcs.STACS.2017.22},
  annote =	{Keywords: APX-hardness, Parameterized algorithm, Scheduling, Exponential Time Hypothesis}
}
Document
H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms

Authors: Gábor Bacsó, Dániel Marx, and Zsolt Tuza

Published in: LIPIcs, Volume 63, 11th International Symposium on Parameterized and Exact Computation (IPEC 2016)


Abstract
It is an outstanding open question in algorithmic graph theory to determine the complexity of the MAXIMUM INDEPENDENT SET problem on P_t-free graphs, that is, on graphs not containing any induced path on t vertices. So far, polynomial-time algorithms are known only for t at most 5 [Lokshtanov et al., SODA 2014, 570-581]. Here we study the existence of subexponential-time algorithms for the problem: by generalizing an earlier result of Randerath and Schiermeyer for t=5 [Discrete App. Math., 158 (2010), 1041-1044], we show that for any t at least 5, there is an algorithm for MAXIMUM INDEPENDENT SET on P_t-free graphs whose running time is subexponential in the number of vertices. SCATTERED SET is the generalization of MAXIMUM INDEPENDENT SET where the vertices of the solution are required to be at distance at least d from each other. We give a complete characterization of those graphs H for which SCATTERED SET on H-free graphs can be solved in time subexponential in the size of the input (that is, in the number of vertices plus the number of edges): * If every component of H is a path, then d-SCATTERED SET on H-free graphs with n vertices and m edges can be solved in time 2^{(n+m)^{1-O(1/|V(H)|)}}, even if d is part of the input. * Otherwise, assuming ETH, there is no 2^{o(n+m)} time algorithm for d-SCATTERED SET for any fixed d at least 3 on H-free graphs with n vertices and m edges.
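
For concreteness, here is a tiny verifier for the d-SCATTERED SET condition (pairwise graph distance at least d between all chosen vertices; for d = 2 this is exactly an independent set). It only checks a given set and says nothing about the subexponential-time algorithms of the paper; the function name and input encoding are assumptions.

from collections import deque

def is_d_scattered(n, edges, S, d):
    # n: number of vertices (labelled 0..n-1); edges: list of undirected pairs.
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    S = list(S)
    for s in S:
        # BFS distances from s.
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        if any(t != s and dist.get(t, float("inf")) < d for t in S):
            return False
    return True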

Cite as

Gábor Bacsó, Dániel Marx, and Zsolt Tuza. H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms. In 11th International Symposium on Parameterized and Exact Computation (IPEC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 63, pp. 3:1-3:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{bacso_et_al:LIPIcs.IPEC.2016.3,
  author =	{Bacs\'{o}, G\'{a}bor and Marx, D\'{a}niel and Tuza, Zsolt},
  title =	{{H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms}},
  booktitle =	{11th International Symposium on Parameterized and Exact Computation (IPEC 2016)},
  pages =	{3:1--3:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-023-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{63},
  editor =	{Guo, Jiong and Hermelin, Danny},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2016.3},
  URN =		{urn:nbn:de:0030-drops-69397},
  doi =		{10.4230/LIPIcs.IPEC.2016.3},
  annote =	{Keywords: independent set, scattered set, subexponential algorithms, H-free graphs}
}
Document
Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221)

Authors: Jeff Erickson, Philip N. Klein, Dániel Marx, and Claire Mathieu

Published in: Dagstuhl Reports, Volume 6, Issue 5 (2016)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 16221 “Algorithms for Optimization Problems in Planar Graphs”. The seminar was held from May 29 to June 3, 2016. This report contains abstracts for the recent developments in planar graph algorithms discussed during the seminar as well as summaries of open problems in this area of research.

Cite as

Jeff Erickson, Philip N. Klein, Dániel Marx, and Claire Mathieu. Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221). In Dagstuhl Reports, Volume 6, Issue 5, pp. 94-113, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@Article{erickson_et_al:DagRep.6.5.94,
  author =	{Erickson, Jeff and Klein, Philip N. and Marx, D\'{a}niel and Mathieu, Claire},
  title =	{{Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221)}},
  pages =	{94--113},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2016},
  volume =	{6},
  number =	{5},
  editor =	{Erickson, Jeff and Klein, Philip N. and Marx, D\'{a}niel and Mathieu, Claire},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.6.5.94},
  URN =		{urn:nbn:de:0030-drops-67227},
  doi =		{10.4230/DagRep.6.5.94},
  annote =	{Keywords: Algorithms, planar graphs, theory, approximation, fixed-parameter tractable, network flow, network design, kernelization}
}
Document
Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs

Authors: Dániel Marx, Ario Salmasi, and Anastasios Sidiropoulos

Published in: LIPIcs, Volume 60, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)


Abstract
In the Asymmetric Traveling Salesperson Problem (ATSP) the goal is to find a closed walk of minimum cost in a directed graph visiting every vertex. We consider the approximability of ATSP on topologically restricted graphs. It has been shown by Oveis Gharan and Saberi [SODA, 2011] that there exists polynomial-time constant-factor approximations on planar graphs and more generally graphs of constant orientable genus. This result was extended to non-orientable genus by Erickson and Sidiropoulos [SoCG, 2014]. We show that for any class of nearly-embeddable graphs, ATSP admits a polynomial-time constant-factor approximation. More precisely, we show that for any fixed non-negative k, there exist positive alpha and beta, such that ATSP on n-vertex k-nearly-embeddable graphs admits an alpha-approximation in time O(n^beta). The class of k-nearly-embeddable graphs contains graphs with at most k apices, k vortices of width at most k, and an underlying surface of either orientable or non-orientable genus at most k. Prior to our work, even the case of graphs with a single apex was open. Our algorithm combines tools from rounding the Held-Karp LP via thin trees with dynamic programming. We complement our upper bounds by showing that solving ATSP exactly on graphs of pathwidth k (and hence on k-nearly embeddable graphs) requires time n^{Omega(k)}, assuming the Exponential-Time Hypothesis (ETH). This is surprising in light of the fact that both TSP on undirected graphs and Minimum Cost Hamiltonian Cycle on directed graphs are FPT parameterized by treewidth.
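
For context only: the textbook Held-Karp bitmask dynamic program below solves ATSP exactly in O(2^n n^2) time on a complete cost matrix (e.g., the metric closure of the input digraph, assuming n >= 2). It is unrelated to the paper's constant-factor approximation, which instead rounds the Held-Karp LP relaxation via thin trees; the name and input format are assumptions.

def held_karp_atsp(cost):
    # cost[i][j] is the cost of the arc i -> j in a complete digraph on n >= 2 vertices.
    n = len(cost)
    FULL = 1 << n
    INF = float("inf")
    dp = [[INF] * n for _ in range(FULL)]
    dp[1][0] = 0                        # tours are rooted at vertex 0
    for mask in range(FULL):
        if not mask & 1:                # vertex 0 must always be in the visited set
            continue
        for last in range(n):
            if dp[mask][last] == INF:
                continue
            for nxt in range(n):
                if mask >> nxt & 1:
                    continue
                new_mask = mask | (1 << nxt)
                cand = dp[mask][last] + cost[last][nxt]
                if cand < dp[new_mask][nxt]:
                    dp[new_mask][nxt] = cand
    # Close the tour by returning to vertex 0.
    return min(dp[FULL - 1][last] + cost[last][0] for last in range(1, n))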

Cite as

Dániel Marx, Ario Salmasi, and Anastasios Sidiropoulos. Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 60, pp. 16:1-16:54, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.APPROX-RANDOM.2016.16,
  author =	{Marx, D\'{a}niel and Salmasi, Ario and Sidiropoulos, Anastasios},
  title =	{{Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)},
  pages =	{16:1--16:54},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-018-7},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{60},
  editor =	{Jansen, Klaus and Mathieu, Claire and Rolim, Jos\'{e} D. P. and Umans, Chris},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2016.16},
  URN =		{urn:nbn:de:0030-drops-66391},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2016.16},
  annote =	{Keywords: asymmetric TSP, approximation algorithms, nearly-embeddable graphs, Held-Karp LP, exponential time hypothesis}
}
Document
The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems

Authors: Andreas Emil Feldmann and Dániel Marx

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
Given a directed graph G and a list (s_1, t_1), ..., (s_k, t_k) of terminal pairs, the Directed Steiner Network problem asks for a minimum-cost subgraph of G that contains a directed s_i -> t_i path for every 1 <= i <= k. The special case Directed Steiner Tree (when we ask for paths from a root r to terminals t_1, ..., t_k) is known to be fixed-parameter tractable parameterized by the number of terminals, while the special case Strongly Connected Steiner Subgraph (when we ask for a path from every t_i to every other t_j) is known to be W[1]-hard parameterized by the number of terminals. We systematically explore the complexity landscape of directed Steiner problems to fully understand which other special cases are FPT or W[1]-hard. Formally, if H is a class of directed graphs, then we look at the special case of Directed Steiner Network where the list (s_1, t_1), ..., (s_k, t_k) of requests forms a directed graph that is a member of H. Our main result is a complete characterization of the classes H resulting in fixed-parameter tractable special cases: we show that if every pattern in H has the combinatorial property of being "transitively equivalent to a bounded-length caterpillar with a bounded number of extra edges," then the problem is FPT, and it is W[1]-hard for every recursively enumerable H not having this property. This complete dichotomy unifies and generalizes the known results showing that Directed Steiner Tree is FPT [Dreyfus and Wagner, Networks 1971], Strongly Connected Steiner Subgraph is W[1]-hard [Guo et al., SIAM J. Discrete Math. 2011], and Directed Steiner Network is solvable in polynomial time for a constant number of terminals [Feldman and Ruhl, SIAM J. Comput. 2006], and moreover reveals a large continent of tractable cases that were not known before.
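
A small sketch of what a Directed Steiner Network solution must satisfy (a verifier only; the paper's dichotomy concerns the complexity of finding such a subgraph, which is not attempted here, and the names and input encoding are assumptions):

from collections import defaultdict, deque

def verify_steiner_network(arc_cost, chosen, pairs):
    # arc_cost: dict (u, v) -> cost for the arcs of G
    # chosen:   collection of arcs (u, v) of G picked as the candidate solution
    # pairs:    list of (s_i, t_i) demands
    # Returns the total cost if every demand is satisfied, and None otherwise.
    adj = defaultdict(list)
    for u, v in chosen:
        adj[u].append(v)

    def reaches(s, t):
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            if u == t:
                return True
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return False

    if not all(reaches(s, t) for s, t in pairs):
        return None
    return sum(arc_cost[a] for a in chosen)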

Cite as

Andreas Emil Feldmann and Dániel Marx. The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 27:1-27:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{feldmann_et_al:LIPIcs.ICALP.2016.27,
  author =	{Feldmann, Andreas Emil and Marx, D\'{a}niel},
  title =	{{The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{27:1--27:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.27},
  URN =		{urn:nbn:de:0030-drops-63060},
  doi =		{10.4230/LIPIcs.ICALP.2016.27},
  annote =	{Keywords: Directed Steiner Tree, Directed Steiner Network, fixed-parameter tractability, dichotomy}
}
Document
Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth

Authors: Dániel Marx and Valia Mitsou

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
Choosability, introduced by Erdős, Rubin, and Taylor [Congr. Numer. 1979], is a well-studied concept in graph theory: we say that a graph is c-choosable if for any assignment of a list of c colors to each vertex, there is a proper coloring where each vertex uses a color from its list. We study the complexity of deciding choosability on graphs of bounded treewidth. It follows from earlier work that 3-choosability can be decided in time 2^(2^(O(w)))*n^(O(1)) on graphs of treewidth w. We complement this result by a matching lower bound giving evidence that double-exponential dependence on treewidth may be necessary for the problem: we show that an algorithm with running time 2^(2^(o(w)))*n^(O(1)) would violate the Exponential-Time Hypothesis (ETH). We also consider the optimization problem where the task is to delete the minimum number of vertices to make the graph 4-choosable, and demonstrate that the dependence on treewidth becomes triple-exponential for this problem: it can be solved in time 2^(2^(2^(O(w))))*n^(O(1)) on graphs of treewidth w, but an algorithm with running time 2^(2^(2^(o(w))))*n^(O(1)) would violate ETH.
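
To unpack the definition, here is a doubly exponential brute-force test of c-choosability for tiny graphs, which makes the quantifier alternation (for every list assignment, there exists a proper list coloring) explicit. It is not the treewidth-based algorithm from the paper; it relies on the standard observation that only the intersection pattern of the lists matters, so restricting lists to a palette of c*|V| colors suffices. Names and input encoding are assumptions.

from itertools import combinations, product

def proper_list_coloring_exists(edges, lists):
    # Does any coloring respecting `lists` make all edges properly colored?
    vertices = list(lists)
    for choice in product(*(list(lists[v]) for v in vertices)):
        col = dict(zip(vertices, choice))
        if all(col[u] != col[v] for u, v in edges):
            return True
    return False

def is_c_choosable(vertices, edges, c):
    # Enumerate every assignment of c-element lists from a palette of c*|V| colors
    # and require a proper list coloring for each of them.  Tiny graphs only.
    palette = range(c * len(vertices))
    for assignment in product(combinations(palette, c), repeat=len(vertices)):
        lists = dict(zip(vertices, assignment))
        if not proper_list_coloring_exists(edges, lists):
            return False
    return True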

Cite as

Dániel Marx and Valia Mitsou. Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 28:1-28:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ICALP.2016.28,
  author =	{Marx, D\'{a}niel and Mitsou, Valia},
  title =	{{Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{28:1--28:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.28},
  URN =		{urn:nbn:de:0030-drops-63078},
  doi =		{10.4230/LIPIcs.ICALP.2016.28},
  annote =	{Keywords: Parameterized Complexity, List coloring, Treewidth, Lower bounds under ETH}
}
Document
Routing with Congestion in Acyclic Digraphs

Authors: Saeed Akhoondian Amiri, Stephan Kreutzer, Dániel Marx, and Roman Rabinovich

Published in: LIPIcs, Volume 58, 41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016)


Abstract
We study the version of the k-disjoint paths problem where k demand pairs (s_1,t_1), ..., (s_k,t_k) are specified in the input and the paths in the solution are allowed to intersect, but such that no vertex is on more than c paths. We show that on directed acyclic graphs the problem is solvable in time n^{O(d)} if we allow congestion k-d for k paths. Furthermore, we show that, under a suitable complexity theoretic assumption, the problem cannot be solved in time f(k)n^{o(d*log(d))} for any computable function f.

Cite as

Saeed Akhoondian Amiri, Stephan Kreutzer, Dániel Marx, and Roman Rabinovich. Routing with Congestion in Acyclic Digraphs. In 41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 58, pp. 7:1-7:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{amiri_et_al:LIPIcs.MFCS.2016.7,
  author =	{Amiri, Saeed Akhoondian and Kreutzer, Stephan and Marx, D\'{a}niel and Rabinovich, Roman},
  title =	{{Routing with Congestion in Acyclic Digraphs}},
  booktitle =	{41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016)},
  pages =	{7:1--7:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-016-3},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{58},
  editor =	{Faliszewski, Piotr and Muscholl, Anca and Niedermeier, Rolf},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2016.7},
  URN =		{urn:nbn:de:0030-drops-64244},
  doi =		{10.4230/LIPIcs.MFCS.2016.7},
  annote =	{Keywords: algorithms, disjoint paths, congestion, acyclic digraphs, XP, W\lbrack1\rbrack-hard}
}
Document
Fixed-Parameter Approximability of Boolean MinCSPs

Authors: Édouard Bonnet, László Egri, and Dániel Marx

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
The minimum unsatisfiability version of a constraint satisfaction problem (CSP) asks for an assignment where the number of unsatisfied constraints is minimum possible, or equivalently, asks for a minimum-size set of constraints whose deletion makes the instance satisfiable. For a finite set Gamma of constraints, we denote by CSP(Gamma) the restriction of the problem where each constraint is from Gamma. The polynomial-time solvability and the polynomial-time approximability of CSP(Gamma) were fully characterized by [Khanna et al. SICOMP 2000]. Here we study the fixed-parameter (FP-) approximability of the problem: given an instance and an integer k, one has to find a solution of size at most g(k) in time f(k)n^{O(1)} if a solution of size at most k exists. We especially focus on the case of constant-factor FP-approximability. Our main result classifies each finite constraint language Gamma into one of three classes: (1) CSP(Gamma) has a constant-factor FP-approximation; (2) CSP(Gamma) has a (constant-factor) FP-approximation if and only if Nearest Codeword has a (constant-factor) FP-approximation; (3) CSP(Gamma) has no FP-approximation, unless FPT=W[P]. We show that problems in the second class do not have constant-factor FP-approximations if both the Exponential-Time Hypothesis (ETH) and the Linear PCP Conjecture (LPC) hold. We also show that such an approximation would imply the existence of an FP-approximation for the k-Densest Subgraph problem with ratio 1-epsilon for any epsilon>0.

Cite as

Édouard Bonnet, László Egri, and Dániel Marx. Fixed-Parameter Approximability of Boolean MinCSPs. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 18:1-18:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{bonnet_et_al:LIPIcs.ESA.2016.18,
  author =	{Bonnet, \'{E}douard and Egri, L\'{a}szl\'{o} and Marx, D\'{a}niel},
  title =	{{Fixed-Parameter Approximability of Boolean MinCSPs}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{18:1--18:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.18},
  URN =		{urn:nbn:de:0030-drops-63694},
  doi =		{10.4230/LIPIcs.ESA.2016.18},
  annote =	{Keywords: constraint satisfaction problems, approximability, fixed-parameter tractability}
}
Document
Invited Talk
The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems (Invited Talk)

Authors: Dániel Marx

Published in: LIPIcs, Volume 53, 15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016)


Abstract
Given a directed graph G and a list (s_1,t_1), ..., (s_k,t_k) of terminal pairs, the Directed Steiner Network problem asks for a minimum-cost subgraph of G that contains a directed s_i-> t_i path for every 1<= i <= k. Feldman and Ruhl presented an n^{O(k)} time algorithm for the problem, which shows that it is polynomial-time solvable for every fixed number k of demands. There are special cases of the problem that can be solved much more efficiently: for example, the special case Directed Steiner Tree (when we ask for paths from a root r to terminals t_1, ..., t_k) is known to be fixed-parameter tractable parameterized by the number of terminals, that is, algorithms with running time of the form f(k)*n^{O(1)} exist for the problem. On the other hand, the special case Strongly Connected Steiner Subgraph (when we ask for a path from every t_i to every other t_j) is known to be W[1]-hard parameterized by the number of terminals, hence it is unlikely to be fixed-parameter tractable. In the talk, we survey results on parameterized algorithms for special cases of Directed Steiner Network, including a recent complete classification result (joint work with Andreas Feldmann) that systematically explores the complexity landscape of directed Steiner problems to fully understand which special cases are FPT or W[1]-hard.

Cite as

Dániel Marx. The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems (Invited Talk). In 15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 53, p. 32:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.SWAT.2016.32,
  author =	{Marx, D\'{a}niel},
  title =	{{The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems}},
  booktitle =	{15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016)},
  pages =	{32:1--32:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-011-8},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{53},
  editor =	{Pagh, Rasmus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2016.32},
  URN =		{urn:nbn:de:0030-drops-60535},
  doi =		{10.4230/LIPIcs.SWAT.2016.32},
  annote =	{Keywords: Directed Steiner Tree, Directed Steiner Network, fixed-parameter tractability, dichotomy}
}
Document
Peeling and Nibbling the Cactus: Subexponential-Time Algorithms for Counting Triangulations and Related Problems

Authors: Dániel Marx and Tillmann Miltzow

Published in: LIPIcs, Volume 51, 32nd International Symposium on Computational Geometry (SoCG 2016)


Abstract
Given a set of n points S in the plane, a triangulation T of S is a maximal set of non-crossing segments with endpoints in S. We present an algorithm that computes the number of triangulations of a given set of n points in time n^{(11+o(1)) sqrt{n}}, significantly improving the previous best running time of O(2^n n^2) by Alvarez and Seidel [SoCG 2013]. Our main tool is identifying separators of size O(sqrt{n}) of a triangulation in a canonical way. The definition of the separators is based on the decomposition of the triangulation into nested layers ("cactus graphs"). Based on the above algorithm, we develop a simple and formal framework to count other non-crossing straight-line graphs in n^{O(sqrt{n})} time. We demonstrate the usefulness of the framework by applying it to counting non-crossing Hamilton cycles, spanning trees, perfect matchings, 3-colorable triangulations, connected graphs, cycle decompositions, quadrangulations, 3-regular graphs, and more.
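
As a sanity check for the counting problem under the restrictive assumption that the points are in convex position (where the count is simply a Catalan number; the paper's n^{O(sqrt{n})}-time algorithm handles arbitrary point sets), one can compute:

from math import comb

def triangulations_convex_position(n):
    # Number of triangulations of n >= 3 points in convex position:
    # the Catalan number C_{n-2} = binom(2(n-2), n-2) / (n-1).
    k = n - 2
    return comb(2 * k, k) // (k + 1)

# Example: triangulations_convex_position(5) == 5, matching the five
# triangulations of a convex pentagon.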

Cite as

Dániel Marx and Tillmann Miltzow. Peeling and Nibbling the Cactus: Subexponential-Time Algorithms for Counting Triangulations and Related Problems. In 32nd International Symposium on Computational Geometry (SoCG 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 51, pp. 52:1-52:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.SoCG.2016.52,
  author =	{Marx, D\'{a}niel and Miltzow, Tillmann},
  title =	{{Peeling and Nibbling the Cactus:  Subexponential-Time Algorithms for Counting Triangulations and Related Problems}},
  booktitle =	{32nd International Symposium on Computational Geometry (SoCG 2016)},
  pages =	{52:1--52:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-009-5},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{51},
  editor =	{Fekete, S\'{a}ndor and Lubiw, Anna},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2016.52},
  URN =		{urn:nbn:de:0030-drops-59445},
  doi =		{10.4230/LIPIcs.SoCG.2016.52},
  annote =	{Keywords: computational geometry, triangulations, exponential-time algorithms}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301)

Authors: Andrei A. Bulatov, Venkatesan Guruswami, Andrei Krokhin, and Dániel Marx

Published in: Dagstuhl Reports, Volume 5, Issue 7 (2016)


Abstract
During the past two decades, an impressive array of diverse methods from several different mathematical fields, including algebra, logic, mathematical programming, probability theory, graph theory, and combinatorics, have been used to analyze both the computational complexity and approximability of algorithmic tasks related to the constraint satisfaction problem (CSP), as well as the applicability/limitations of algorithmic techniques. This research direction develops at an impressive speed, regularly producing very strong and general results. The Dagstuhl Seminar 15301 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP, so that they can share their insights obtained during the past three years. This report documents the material presented during the course of the seminar.

Cite as

Andrei A. Bulatov, Venkatesan Guruswami, Andrei Krokhin, and Dániel Marx. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301). In Dagstuhl Reports, Volume 5, Issue 7, pp. 22-41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@Article{bulatov_et_al:DagRep.5.7.22,
  author =	{Bulatov, Andrei A. and Guruswami, Venkatesan and Krokhin, Andrei and Marx, D\'{a}niel},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301)}},
  pages =	{22--41},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2016},
  volume =	{5},
  number =	{7},
  editor =	{Bulatov, Andrei A. and Guruswami, Venkatesan and Krokhin, Andrei and Marx, D\'{a}niel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.5.7.22},
  URN =		{urn:nbn:de:0030-drops-56714},
  doi =		{10.4230/DagRep.5.7.22},
  annote =	{Keywords: Constraint satisfaction problem (CSP), Computational complexity, CSP dichotomy conjecture, Hardness of approximation, Unique games conjecture, Fixed-parameter tractability, Descriptive complexity, Universal algebra, Logic, Decomposition methods}
}
Document
Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451)

Authors: Stefan Kratsch, Daniel Lokshtanov, Dániel Marx, and Peter Rossmanith

Published in: Dagstuhl Reports, Volume 4, Issue 11 (2015)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 14451 "Optimality and tight results in parameterized complexity". Over the last two decades parameterized complexity has become one of the main tools for handling intractable problems. Recently, tools have been developed not only to classify problems, but also to make statements about how close an algorithm is to being optimal with respect to running time. The focus of this seminar was to highlight and discuss recent, relevant results within this optimality framework and to discover fruitful research directions. The report contains the abstracts of the results presented at the seminar, as well as a collection of open problems stated at the seminar.

Cite as

Stefan Kratsch, Daniel Lokshtanov, Dániel Marx, and Peter Rossmanith. Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451). In Dagstuhl Reports, Volume 4, Issue 11, pp. 1-21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


Copy BibTex To Clipboard

@Article{kratsch_et_al:DagRep.4.11.1,
  author =	{Kratsch, Stefan and Lokshtanov, Daniel and Marx, D\'{a}niel and Rossmanith, Peter},
  title =	{{Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451)}},
  pages =	{1--21},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2015},
  volume =	{4},
  number =	{11},
  editor =	{Kratsch, Stefan and Lokshtanov, Daniel and Marx, D\'{a}niel and Rossmanith, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.4.11.1},
  URN =		{urn:nbn:de:0030-drops-49677},
  doi =		{10.4230/DagRep.4.11.1},
  annote =	{Keywords: Algorithms, parameterized complexity, kernels, width measures, exponential time hypothesis, lower bounds}
}
Document
Chordal Editing is Fixed-Parameter Tractable

Authors: Yixin Cao and Dániel Marx

Published in: LIPIcs, Volume 25, 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)


Abstract
Graph modification problems are typically asked as follows: is there a set of k operations that transforms a given graph into one with a certain property? The most commonly considered operations include vertex deletion, edge deletion, and edge addition; for the same property, one can define significantly different versions by allowing different operations. We study a very general graph modification problem which allows all three types of operations: given a graph G and integers k_1, k_2, and k_3, the CHORDAL EDITING problem asks if G can be transformed into a chordal graph by at most k_1 vertex deletions, k_2 edge deletions, and k_3 edge additions. Clearly, this problem generalizes both CHORDAL VERTEX/EDGE DELETION and CHORDAL COMPLETION (also known as MINIMUM FILL-IN). Our main result is an algorithm for CHORDAL EDITING in time 2^{O(k log k)} ⋅ n^{O(1)}, where k := k_1+k_2+k_3; therefore, the problem is fixed-parameter tractable parameterized by the total number of allowed operations. Our algorithm is both more efficient and conceptually simpler than the previously known algorithm for the special case CHORDAL DELETION.
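To make the problem statement concrete, here is a minimal brute-force sketch; it is exponential in both the graph size and k_1+k_2+k_3, so it only illustrates the definition and is not the paper's 2^{O(k log k)} ⋅ n^{O(1)} algorithm. It assumes the networkx library for the chordality test, and the function name chordal_editing is ours.

# Brute-force CHORDAL EDITING check, meant only to illustrate the problem
# definition from the abstract; exponential in the graph size and in k1+k2+k3.
from itertools import combinations
import networkx as nx


def is_chordal(G):
    # Graphs on at most 3 vertices are trivially chordal.
    return G.number_of_nodes() <= 3 or nx.is_chordal(G)


def chordal_editing(G, k1, k2, k3):
    """True if G can be made chordal by <= k1 vertex deletions,
    <= k2 edge deletions and <= k3 edge additions."""
    nodes = list(G.nodes)
    for d1 in range(k1 + 1):
        for removed in combinations(nodes, d1):
            H = G.copy()
            H.remove_nodes_from(removed)
            present = list(H.edges)
            absent = [e for e in combinations(H.nodes, 2) if not H.has_edge(*e)]
            for d2 in range(k2 + 1):
                for deleted in combinations(present, d2):
                    for d3 in range(k3 + 1):
                        for added in combinations(absent, d3):
                            H2 = H.copy()
                            H2.remove_edges_from(deleted)
                            H2.add_edges_from(added)
                            if is_chordal(H2):
                                return True
    return False


if __name__ == "__main__":
    C4 = nx.cycle_graph(4)               # a 4-cycle is not chordal
    print(chordal_editing(C4, 0, 0, 1))  # True: adding one chord suffices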

Cite as

Yixin Cao and Dániel Marx. Chordal Editing is Fixed-Parameter Tractable. In 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 25, pp. 214-225, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@InProceedings{cao_et_al:LIPIcs.STACS.2014.214,
  author =	{Cao, Yixin and Marx, D\'{a}niel},
  title =	{{Chordal Editing is Fixed-Parameter Tractable}},
  booktitle =	{31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)},
  pages =	{214--225},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-65-1},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{25},
  editor =	{Mayr, Ernst W. and Portier, Natacha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2014.214},
  URN =		{urn:nbn:de:0030-drops-44591},
  doi =		{10.4230/LIPIcs.STACS.2014.214},
  annote =	{Keywords: chordal graph, parameterized computation, graph modification problems, chordal deletion, chordal completion, clique tree decomposition, holes, simplic}
}
Document
Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask)

Authors: Dániel Marx and Michal Pilipczuk

Published in: LIPIcs, Volume 25, 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)


Abstract
Given two graphs H and G, the Subgraph Isomorphism problem asks if H is isomorphic to a subgraph of G. While NP-hard in general, algorithms exist for various parameterized versions of the problem. However, the literature contains very little guidance on which combinations of parameters can or cannot be exploited algorithmically. Our goal is to systematically investigate the possible parameterized algorithms that can exist for Subgraph Isomorphism. We develop a framework involving 10 relevant parameters for each of H and G (such as treewidth, pathwidth, genus, maximum degree, number of vertices, number of components, etc.), and ask if an algorithm with running time f_1(p_1,p_2,...,p_l) ⋅ n^{f_2(p_{l+1},...,p_k)} exists, where each of p_1,...,p_k is one of the 10 parameters depending only on H or G. We show that all the questions arising in this framework are answered by a set of 11 maximal positive results (algorithms) and a set of 17 maximal negative results (hardness proofs); some of these results already appear in the literature, while others are new in this paper. On the algorithmic side, our study reveals for example that an unexpected combination of bounded degree, genus, and feedback vertex set number of G gives rise to a highly nontrivial algorithm for Subgraph Isomorphism. On the hardness side, we present W[1]-hardness proofs under extremely restricted conditions, such as when H is a bounded-degree tree of constant pathwidth and G is a planar graph of bounded pathwidth.
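For reference, the unparameterized baseline that this framework refines is plain brute force over all injective maps from V(H) to V(G), taking roughly |V(G)|^{|V(H)|} time. The self-contained sketch below is our own illustration of that baseline, not one of the paper's algorithms.

# Brute-force Subgraph Isomorphism: try every injective map V(H) -> V(G) and
# check that every edge of H is mapped onto an edge of G.
from itertools import permutations


def subgraph_isomorphic(H_nodes, H_edges, G_nodes, G_edges):
    G_edge_set = {frozenset(e) for e in G_edges}
    for image in permutations(G_nodes, len(H_nodes)):
        phi = dict(zip(H_nodes, image))
        if all(frozenset((phi[u], phi[v])) in G_edge_set for u, v in H_edges):
            return True
    return False


if __name__ == "__main__":
    # H is a triangle; G is a 4-cycle plus one chord, which creates triangles.
    H_nodes, H_edges = [0, 1, 2], [(0, 1), (1, 2), (2, 0)]
    G_nodes, G_edges = [0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
    print(subgraph_isomorphic(H_nodes, H_edges, G_nodes, G_edges))  # True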

Cite as

Dániel Marx and Michal Pilipczuk. Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask). In 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 25, pp. 542-553, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.STACS.2014.542,
  author =	{Marx, D\'{a}niel and Pilipczuk, Michal},
  title =	{{Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask)}},
  booktitle =	{31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)},
  pages =	{542--553},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-65-1},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{25},
  editor =	{Mayr, Ernst W. and Portier, Natacha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2014.542},
  URN =		{urn:nbn:de:0030-drops-44863},
  doi =		{10.4230/LIPIcs.STACS.2014.542},
  annote =	{Keywords: parameterized complexity, subgraph isomorphism}
}
Document
Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421)

Authors: Glencora Borradaile, Philip Klein, Dániel Marx, and Claire Mathieu

Published in: Dagstuhl Reports, Volume 3, Issue 10 (2014)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 13421 "Algorithms for Optimization Problems in Planar Graphs". The seminar was held from October 13 to October 18, 2013. This report contains abstracts for the recent developments in planar graph algorithms discussed during the seminar as well as summaries of open problems in this area of research.

Cite as

Glencora Borradaile, Philip Klein, Dániel Marx, and Claire Mathieu. Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421). In Dagstuhl Reports, Volume 3, Issue 10, pp. 36-57, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@Article{borradaile_et_al:DagRep.3.10.36,
  author =	{Borradaile, Glencora and Klein, Philip and Marx, D\'{a}niel and Mathieu, Claire},
  title =	{{Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421)}},
  pages =	{36--57},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2014},
  volume =	{3},
  number =	{10},
  editor =	{Borradaile, Glencora and Klein, Philip and Marx, D\'{a}niel and Mathieu, Claire},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.3.10.36},
  URN =		{urn:nbn:de:0030-drops-44274},
  doi =		{10.4230/DagRep.3.10.36},
  annote =	{Keywords: Algorithms, planar graphs, theory, approximation, fixed-parameter tractable, network flow, network design, kernelization}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451)

Authors: Johan Hastad, Andrei Krokhin, and Dániel Marx

Published in: Dagstuhl Reports, Volume 2, Issue 11 (2013)


Abstract
During the past two decades, an impressive array of diverse methods from several different mathematical fields, including algebra, logic, analysis, probability theory, graph theory, and combinatorics, has been used to analyze both the computational complexity and approximability of algorithmic tasks related to the constraint satisfaction problem (CSP), as well as the applicability/limitations of algorithmic techniques. The Dagstuhl Seminar 12451 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP, so that they could share their insights. This report documents the material presented during the course of the seminar.

Cite as

Johan Hastad, Andrei Krokhin, and Dániel Marx. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451). In Dagstuhl Reports, Volume 2, Issue 11, pp. 1-19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


Copy BibTex To Clipboard

@Article{hastad_et_al:DagRep.2.11.1,
  author =	{Hastad, Johan and Krokhin, Andrei and Marx, D\'{a}niel},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451)}},
  pages =	{1--19},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2013},
  volume =	{2},
  number =	{11},
  editor =	{Hastad, Johan and Krokhin, Andrei and Marx, D\'{a}niel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.2.11.1},
  URN =		{urn:nbn:de:0030-drops-39764},
  doi =		{10.4230/DagRep.2.11.1},
  annote =	{Keywords: Constraint satisfaction problem (CSP); Computational complexity; CSP dichotomy conjecture; Hardness of approximation; Unique games conjecture; Fixed-parameter tractability; Descriptive complexity; Universal algebra; Logic; Decomposition methods}
}
Document
Tutorial
Algorithmic Graph Structure Theory (Tutorial)

Authors: Dániel Marx

Published in: LIPIcs, Volume 20, 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)


Abstract
The Graph Minors project of Robertson and Seymour uncovered a very deep structural theory of graphs. This theory had several important consequences, among others, the proof of Wagner's Conjecture. While the whole theory, presented in a series of 23 very dense papers, is notoriously difficult to understand, it has to be emphasized that these papers introduced several elementary concepts and tools that had strong impact on algorithms, complexity, and combinatorics. Moreover, even some of the very deep results can be stated in a compact and useful way, and it is possible to build upon these results without a complete understanding of the underlying machinery. In the first part of the lecture, I will introduce the concept of treewidth, which can be thought of as an elementary entry point to graph minors theory. I will overview its graph-theoretic and algorithmic properties that make it especially important in the design of parameterized algorithms and approximation schemes on planar graphs. Furthermore, I will briefly explain some of the connections of treewidth to complexity and automata theory. In the next part of the lecture, we will turn our attention to the more advanced topic of graphs excluding a fixed minor: the structure of such graphs, finding minors, and the well-quasi-ordering of the minor relation. The primary goal here is to provide clear and useful statements of these results and to show how they generalize the concepts of treewidth and planar graphs. Finally, I will briefly overview some more recent results involving different kinds of excluded structures, such as graphs excluding odd minors and topological minors.
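As a small hands-on companion to the treewidth part of the tutorial, the sketch below computes heuristic treewidth upper bounds for a few standard graphs. It assumes networkx's approximation module (treewidth_min_degree); the exact values quoted in the comment are textbook facts, not statements from the tutorial.

# Treewidth upper bounds via the min-degree elimination heuristic, assuming
# networkx.algorithms.approximation is available.
import networkx as nx
from networkx.algorithms import approximation as approx

for name, G in [("tree", nx.balanced_tree(2, 3)),
                ("cycle C6", nx.cycle_graph(6)),
                ("grid 4x4", nx.grid_2d_graph(4, 4)),
                ("clique K5", nx.complete_graph(5))]:
    width, decomposition = approx.treewidth_min_degree(G)
    # Textbook values: trees have treewidth 1, cycles 2, the k x k grid k,
    # and the clique K_n has treewidth n-1; the heuristic gives upper bounds.
    print(f"{name}: heuristic width bound = {width}, "
          f"{decomposition.number_of_nodes()} bags in the decomposition")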

Cite as

Dániel Marx. Algorithmic Graph Structure Theory (Tutorial). In 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 20, p. 7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.STACS.2013.7,
  author =	{Marx, D\'{a}niel},
  title =	{{Algorithmic Graph Structure Theory}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{7--7},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Portier, Natacha and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2013.7},
  URN =		{urn:nbn:de:0030-drops-39175},
  doi =		{10.4230/LIPIcs.STACS.2013.7},
  annote =	{Keywords: Graph theory, graph minors, structure theorems}
}
Document
Data Reduction and Problem Kernels (Dagstuhl Seminar 12241)

Authors: Michael R. Fellows, Jiong Guo, Dániel Marx, and Saket Saurabh

Published in: Dagstuhl Reports, Volume 2, Issue 6 (2012)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 12241 "Data Reduction and Problem Kernels". During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Michael R. Fellows, Jiong Guo, Dániel Marx, and Saket Saurabh. Data Reduction and Problem Kernels (Dagstuhl Seminar 12241). In Dagstuhl Reports, Volume 2, Issue 6, pp. 26-50, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


Copy BibTex To Clipboard

@Article{fellows_et_al:DagRep.2.6.26,
  author =	{Fellows, Michael R. and Guo, Jiong and Marx, D\'{a}niel and Saurabh, Saket},
  title =	{{Data Reduction and Problem Kernels (Dagstuhl Seminar 12241)}},
  pages =	{26--50},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2012},
  volume =	{2},
  number =	{6},
  editor =	{Fellows, Michael R. and Guo, Jiong and Marx, D\'{a}niel and Saurabh, Saket},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.2.6.26},
  URN =		{urn:nbn:de:0030-drops-37297},
  doi =		{10.4230/DagRep.2.6.26},
  annote =	{Keywords: Preprocessing, Fixed-parameter tractability, Parameterized algorithmics}
}
Document
Treewidth Reduction for Constrained Separation and Bipartization Problems

Authors: Dániel Marx, Barry O'Sullivan, and Igor Razgon

Published in: LIPIcs, Volume 5, 27th International Symposium on Theoretical Aspects of Computer Science (2010)


Abstract
We present a method for reducing the treewidth of a graph while preserving all the minimal s-t separators. This technique turns out to be very useful for establishing the fixed-parameter tractability of constrained separation and bipartization problems. To demonstrate the power of this technique, we prove the fixed-parameter tractability of a number of well-known separation and bipartization problems with various additional restrictions (e.g., the vertices being removed from the graph form an independent set). These results answer a number of open questions in the area of parameterized complexity.

Cite as

Dániel Marx, Barry O'Sullivan, and Igor Razgon. Treewidth Reduction for Constrained Separation and Bipartization Problems. In 27th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 5, pp. 561-572, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.STACS.2010.2485,
  author =	{Marx, D\'{a}niel and O'Sullivan, Barry and Razgon, Igor},
  title =	{{Treewidth Reduction for Constrained Separation and Bipartization Problems}},
  booktitle =	{27th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{561--572},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-16-3},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{5},
  editor =	{Marion, Jean-Yves and Schwentick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2010.2485},
  URN =		{urn:nbn:de:0030-drops-24850},
  doi =		{10.4230/LIPIcs.STACS.2010.2485},
  annote =	{Keywords: Fixed-parameter algorithms, graph separation problems, treewidth}
}
Document
09511 Abstracts Collection – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
From December 14 to December 17, 2009, the Dagstuhl Seminar 09511 "Parameterized complexity and approximation algorithms" was held in Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Abstracts Collection – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.1,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Abstracts Collection – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.1},
  URN =		{urn:nbn:de:0030-drops-25025},
  doi =		{10.4230/DagSemProc.09511.1},
  annote =	{Keywords: Parameterized complexity, Approximation algorithms}
}
Document
09511 Executive Summary – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
Many of the computational problems that arise in practice are optimization problems: the task is to find a solution where the cost, quality, size, profit, or some other measure is as large or small as possible. The NP-hardness of an optimization problem implies that, unless P = NP, there is no polynomial-time algorithm that finds the exact value of the optimum. Various approaches have been proposed in the literature to cope with NP-hard problems. When designing approximation algorithms, we relax the requirement that the algorithm produces an optimum solution, and our aim is to devise a polynomial-time algorithm such that the solution it produces is not necessarily optimal, but there is some worst-case bound on the solution quality.
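A classic illustration of such a worst-case guarantee (a textbook example, not taken from the seminar report) is the maximal-matching 2-approximation for Vertex Cover, sketched below.

# Classic 2-approximation for Vertex Cover: greedily pick an uncovered edge and
# add both of its endpoints. Any cover must contain at least one endpoint of
# each edge picked this way, so the output is at most twice the optimum.
def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover


if __name__ == "__main__":
    # Path on 5 vertices: the optimum cover {1, 3} has size 2,
    # the greedy cover has size at most 4.
    print(vertex_cover_2approx([(0, 1), (1, 2), (2, 3), (3, 4)]))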

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Executive Summary – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.2,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Executive Summary – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.2},
  URN =		{urn:nbn:de:0030-drops-25011},
  doi =		{10.4230/DagSemProc.09511.2},
  annote =	{Keywords: Parameterized complexity, Approximation algorithms}
}
Document
09511 Open Problems – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
The paper contains a list of the problems presented on Monday, December 14, 2009 at the open problem session of the Seminar on Parameterized Complexity and Approximation Algorithms, held at Schloss Dagstuhl in Wadern, Germany.

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Open Problems – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.3,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Open Problems – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  pages =	{1--10},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.3},
  URN =		{urn:nbn:de:0030-drops-24992},
  doi =		{10.4230/DagSemProc.09511.3},
  annote =	{Keywords: Parameterized complexity, approximation algorithms, open problems}
}
Document
Enumerating Homomorphisms

Authors: Andrei A. Bulatov, Victor Dalmau, Martin Grohe, and Daniel Marx

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The homomorphism problem for relational structures is an abstract way of formulating constraint satisfaction problems (CSP) and various problems in database theory. The decision version of the homomorphism problem has received a lot of attention in the literature; in particular, the way the graph-theoretical structure of the variables and constraints influences the complexity of the problem has been intensively studied. Here we study the problem of enumerating all the solutions with polynomial delay from a similar point of view. It turns out that the enumeration problem behaves very differently from the decision version. We give evidence that it is unlikely that a characterization result similar to the decision version can be obtained. Nevertheless, we show nontrivial cases where enumeration can be done with polynomial delay.
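For concreteness, the sketch below enumerates all homomorphisms between two small graphs by plain backtracking; it illustrates the enumeration task only and makes no polynomial-delay guarantee of the kind studied in the paper.

# Enumerate all homomorphisms from graph H to graph G by backtracking: assign
# images to the vertices of H one by one and check every edge whose endpoints
# are both already assigned.
def homomorphisms(H_nodes, H_edges, G_nodes, G_adj):
    order = list(H_nodes)

    def extend(i, phi):
        if i == len(order):
            yield dict(phi)
            return
        v = order[i]
        for image in G_nodes:
            phi[v] = image
            if all(phi[b] in G_adj[phi[a]]
                   for a, b in H_edges if a in phi and b in phi):
                yield from extend(i + 1, phi)
            del phi[v]

    yield from extend(0, {})


if __name__ == "__main__":
    # H is a single edge, G is a triangle: there are 3 * 2 = 6 homomorphisms.
    G_adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
    print(len(list(homomorphisms([0, 1], [(0, 1)], [0, 1, 2], G_adj))))  # 6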

Cite as

Andrei A. Bulatov, Victor Dalmau, Martin Grohe, and Daniel Marx. Enumerating Homomorphisms. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 231-242, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


Copy BibTex To Clipboard

@InProceedings{bulatov_et_al:LIPIcs.STACS.2009.1838,
  author =	{Bulatov, Andrei A. and Dalmau, Victor and Grohe, Martin and Marx, Daniel},
  title =	{{Enumerating Homomorphisms}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{231--242},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1838},
  URN =		{urn:nbn:de:0030-drops-18385},
  doi =		{10.4230/LIPIcs.STACS.2009.1838},
  annote =	{Keywords: }
}
Document
Tractable Structures for Constraint Satisfaction with Truth Tables

Authors: Daniel Marx

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The way the graph structure of the constraints influences the complexity of constraint satisfaction problems (CSP) is well understood for bounded-arity constraints. The situation is less clear if there is no bound on the arities. In this case the answer depends also on how the constraints are represented in the input. We study this question for the truth table representation of constraints. We introduce a new hypergraph measure {\em adaptive width} and show that CSP with truth tables is polynomial-time solvable if restricted to a class of hypergraphs with bounded adaptive width. Conversely, assuming a conjecture on the complexity of binary CSP, there is no other polynomial-time solvable case.

Cite as

Daniel Marx. Tractable Structures for Constraint Satisfaction with Truth Tables. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 649-660, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.STACS.2009.1807,
  author =	{Marx, Daniel},
  title =	{{Tractable Structures for Constraint Satisfaction with Truth Tables}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{649--660},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1807},
  URN =		{urn:nbn:de:0030-drops-18079},
  doi =		{10.4230/LIPIcs.STACS.2009.1807},
  annote =	{Keywords: Computational complexity, Constraint satisfaction, Treewidth, Adaptive width}
}
Document
07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs

Authors: Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege

Published in: Dagstuhl Seminar Proceedings, Volume 7281, Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs (2007)


Abstract
The following is a list of the problems presented on Monday, July 9, 2007 at the open-problem session of the Seminar on Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs, held at Schloss Dagstuhl in Wadern, Germany.

Cite as

Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege. 07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. In Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. Dagstuhl Seminar Proceedings, Volume 7281, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.07281.2,
  author =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  title =	{{07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs}},
  booktitle =	{Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7281},
  editor =	{Erik Demaine and Gregory Z. Gutin and Daniel Marx and Ulrike Stege},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07281.2},
  URN =		{urn:nbn:de:0030-drops-12542},
  doi =		{10.4230/DagSemProc.07281.2},
  annote =	{Keywords: }
}
Document
07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs

Authors: Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege

Published in: Dagstuhl Seminar Proceedings, Volume 7281, Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs (2007)


Abstract
From 8th to 13th July 2007, the Dagstuhl Seminar "Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege. 07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. In Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. Dagstuhl Seminar Proceedings, Volume 7281, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.07281.1,
  author =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  title =	{{07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs}},
  booktitle =	{Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7281},
  editor =	{Erik Demaine and Gregory Z. Gutin and Daniel Marx and Ulrike Stege},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07281.1},
  URN =		{urn:nbn:de:0030-drops-12450},
  doi =		{10.4230/DagSemProc.07281.1},
  annote =	{Keywords: Parameterized complexity, fixed-parameter tractability, graph structure theory}
}

Marx, Daniel

Document
Approximate Monotone Local Search for Weighted Problems

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 285, 18th International Symposium on Parameterized and Exact Computation (IPEC 2023)


Abstract
In a recent work, Esmer et al. describe a simple method - Approximate Monotone Local Search - to obtain exponential approximation algorithms from existing parameterized exact algorithms, polynomial-time approximation algorithms and, more generally, parameterized approximation algorithms. In this work, we generalize those results to the weighted setting. More formally, we consider monotone subset minimization problems over a weighted universe of size n (e.g., Vertex Cover, d-Hitting Set and Feedback Vertex Set). We consider a model where the algorithm is only given access to a subroutine that finds a solution of weight at most α ⋅ W (and of arbitrary cardinality) in time c^k ⋅ n^{𝒪(1)} where W is the minimum weight of a solution of cardinality at most k. In the unweighted setting, Esmer et al. determine the smallest value d for which a β-approximation algorithm running in time dⁿ ⋅ n^{𝒪(1)} can be obtained in this model. We show that the same dependencies also hold in a weighted setting in this model: for every fixed ε > 0 we obtain a β-approximation algorithm running in time 𝒪((d+ε)ⁿ), for the same d as in the unweighted setting. Similarly, we also extend a β-approximate brute-force search (in a model which only provides access to a membership oracle) to the weighted setting. Using existing approximation algorithms and exact parameterized algorithms for weighted problems, we obtain the first exponential-time β-approximation algorithms that are better than brute force for a variety of problems including Weighted Vertex Cover, Weighted d-Hitting Set, Weighted Feedback Vertex Set and Weighted Multicut.

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Approximate Monotone Local Search for Weighted Problems. In 18th International Symposium on Parameterized and Exact Computation (IPEC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 285, pp. 17:1-17:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.IPEC.2023.17,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Approximate Monotone Local Search for Weighted Problems}},
  booktitle =	{18th International Symposium on Parameterized and Exact Computation (IPEC 2023)},
  pages =	{17:1--17:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-305-8},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{285},
  editor =	{Misra, Neeldhara and Wahlstr\"{o}m, Magnus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2023.17},
  URN =		{urn:nbn:de:0030-drops-194360},
  doi =		{10.4230/LIPIcs.IPEC.2023.17},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
Computing Generalized Convolutions Faster Than Brute Force

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In this paper, we consider a general notion of convolution. Let D be a finite domain and let Dⁿ be the set of n-length vectors (tuples) of D. Let f : D × D → D be a function and let ⊕_f be a coordinate-wise application of f. The f-Convolution of two functions g,h : Dⁿ → {-M,…,M} is (g ⊛_f h)(v) := ∑_{v_g,v_h ∈ D^n s.t. v = v_g ⊕_f v_h} g(v_g) ⋅ h(v_h) for every 𝐯 ∈ Dⁿ. This problem generalizes many fundamental convolutions such as Subset Convolution, XOR Product, Covering Product or Packing Product, etc. For arbitrary function f and domain D we can compute f-Convolution via brute-force enumeration in 𝒪̃(|D|^{2n} ⋅ polylog(M)) time. Our main result is an improvement over this naive algorithm. We show that f-Convolution can be computed exactly in 𝒪̃((c ⋅ |D|²)ⁿ ⋅ polylog(M)) for constant c := 5/6 when D has even cardinality. Our main observation is that a cyclic partition of a function f : D × D → D can be used to speed up the computation of f-Convolution, and we show that an appropriate cyclic partition exists for every f. Furthermore, we demonstrate that a single entry of the f-Convolution can be computed more efficiently. In this variant, we are given two functions g,h : Dⁿ → {-M,…,M} alongside with a vector 𝐯 ∈ Dⁿ and the task of the f-Query problem is to compute integer (g ⊛_f h)(𝐯). This is a generalization of the well-known Orthogonal Vectors problem. We show that f-Query can be computed in 𝒪̃(|D|^{(ω/2)n} ⋅ polylog(M)) time, where ω ∈ [2,2.373) is the exponent of currently fastest matrix multiplication algorithm.
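The sketch below implements the definition of f-Convolution directly, i.e., the naive 𝒪̃(|D|^{2n}) enumeration that the paper improves upon. The example instantiates f as OR over D = {0,1}, which recovers the Covering Product; all names are our own.

# Brute-force f-Convolution, following the definition in the abstract:
# (g *_f h)(v) = sum over all pairs v_g, v_h with v = v_g (+)_f v_h
# of g(v_g) * h(v_h). This is the naive |D|^{2n} enumeration.
from itertools import product


def f_convolution(f, D, n, g, h):
    result = {v: 0 for v in product(D, repeat=n)}
    for v_g in product(D, repeat=n):
        for v_h in product(D, repeat=n):
            v = tuple(f(a, b) for a, b in zip(v_g, v_h))   # coordinate-wise f
            result[v] += g[v_g] * h[v_h]
    return result


if __name__ == "__main__":
    # With D = {0, 1} and f = OR, this is the classic Covering Product.
    D, n = (0, 1), 2
    f = lambda a, b: a | b
    g = {v: 1 for v in product(D, repeat=n)}
    h = {v: 1 for v in product(D, repeat=n)}
    conv = f_convolution(f, D, n, g, h)
    print(conv[(1, 1)])  # 9: pairs of subsets of a 2-element set whose union is the whole set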

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Philipp Schepper, and Karol Węgrzycki. Computing Generalized Convolutions Faster Than Brute Force. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 12:1-12:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.IPEC.2022.12,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Schepper, Philipp and W\k{e}grzycki, Karol},
  title =	{{Computing Generalized Convolutions Faster Than Brute Force}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{12:1--12:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.12},
  URN =		{urn:nbn:de:0030-drops-173685},
  doi =		{10.4230/LIPIcs.IPEC.2022.12},
  annote =	{Keywords: Generalized Convolution, Fast Fourier Transform, Fast Subset Convolution}
}
Document
Domination and Cut Problems on Chordal Graphs with Bounded Leafage

Authors: Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
The leafage of a chordal graph G is the minimum integer 𝓁 such that G can be realized as an intersection graph of subtrees of a tree with 𝓁 leaves. We consider the structural parameterization of classical domination and cut problems on chordal graphs by the leafage. Fomin, Golovach, and Raymond [ESA 2018, Algorithmica 2020] proved, among other things, that Dominating Set on chordal graphs admits an algorithm running in time 2^𝒪(𝓁²) ⋅ n^𝒪(1). We present a conceptually much simpler algorithm that runs in time 2^𝒪(𝓁) ⋅ n^𝒪(1). We extend our approach to obtain similar results for Connected Dominating Set and Steiner Tree. We then consider the two classical cut problems MultiCut with Undeletable Terminals and Multiway Cut with Undeletable Terminals. We prove that the former is W[1]-hard when parameterized by the leafage and complement this result by presenting a simple n^𝒪(𝓁)-time algorithm. To our surprise, we find that, in contrast, Multiway Cut with Undeletable Terminals on chordal graphs can be solved in n^𝒪(1) time.

Cite as

Esther Galby, Dániel Marx, Philipp Schepper, Roohani Sharma, and Prafullkumar Tale. Domination and Cut Problems on Chordal Graphs with Bounded Leafage. In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 14:1-14:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{galby_et_al:LIPIcs.IPEC.2022.14,
  author =	{Galby, Esther and Marx, D\'{a}niel and Schepper, Philipp and Sharma, Roohani and Tale, Prafullkumar},
  title =	{{Domination and Cut Problems on Chordal Graphs with Bounded Leafage}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{14:1--14:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.14},
  URN =		{urn:nbn:de:0030-drops-173704},
  doi =		{10.4230/LIPIcs.IPEC.2022.14},
  annote =	{Keywords: Chordal Graphs, Leafage, FPT Algorithms, Dominating Set, MultiCut with Undeletable Terminals, Multiway Cut with Undeletable Terminals}
}
Document
Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 249, 17th International Symposium on Parameterized and Exact Computation (IPEC 2022)


Abstract
In the general AntiFactor problem, a graph G and, for every vertex v of G, a set X_v ⊆ ℕ of forbidden degrees is given. The task is to find a set S of edges such that the degree of v in S is not in the set X_v. Standard techniques (dynamic programming plus fast convolution) can be used to show that if M is the largest forbidden degree, then the problem can be solved in time (M+2)^{tw}⋅n^{O(1)} if a tree decomposition of width tw is given. However, significantly faster algorithms are possible if the sets X_v are sparse: our main algorithmic result shows that if every vertex has at most x forbidden degrees (we call this special case AntiFactor_x), then the problem can be solved in time (x+1)^{O(tw)}⋅n^{O(1)}. That is, AntiFactor_x is fixed-parameter tractable parameterized by treewidth tw and the maximum number x of excluded degrees. Our algorithm uses the technique of representative sets, which can be generalized to the optimization version, but (as expected) not to the counting version of the problem. In fact, we show that #AntiFactor₁ is already #W[1]-hard parameterized by the width of the given decomposition. Moreover, we show that, unlike for the decision version, the standard dynamic programming algorithm is essentially optimal for the counting version. Formally, for a fixed nonempty set X, we denote by X-AntiFactor the special case where every vertex v has the same set X_v = X of forbidden degrees. We show the following lower bound for every fixed set X: if there is an ε > 0 such that #X-AntiFactor can be solved in time (max X+2-ε)^{tw}⋅n^{O(1)} given a tree decomposition of width tw, then the Counting Strong Exponential-Time Hypothesis (#SETH) fails.
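To illustrate the problem definition (not the treewidth-based algorithms of the paper), here is a brute-force AntiFactor solver that tries every edge subset; the function name anti_factor is ours and the running time is exponential in |E(G)|.

# Brute-force AntiFactor: try every edge subset S and check that deg_S(v) is
# not a forbidden degree for any vertex v.
from itertools import combinations


def anti_factor(nodes, edges, forbidden):
    """forbidden: dict mapping each vertex v to its set X_v of forbidden degrees."""
    m = len(edges)
    for size in range(m + 1):
        for S in combinations(edges, size):
            deg = {v: 0 for v in nodes}
            for u, v in S:
                deg[u] += 1
                deg[v] += 1
            if all(deg[v] not in forbidden[v] for v in nodes):
                return set(S)
    return None


if __name__ == "__main__":
    # Path on 4 vertices where every vertex forbids degree 0:
    # the two outer edges {(0, 1), (2, 3)} form a valid solution.
    nodes = [0, 1, 2, 3]
    edges = [(0, 1), (1, 2), (2, 3)]
    forbidden = {v: {0} for v in nodes}
    print(anti_factor(nodes, edges, forbidden))  # {(0, 1), (2, 3)}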

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard). In 17th International Symposium on Parameterized and Exact Computation (IPEC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 249, pp. 22:1-22:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.IPEC.2022.22,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Anti-Factor Is FPT Parameterized by Treewidth and List Size (But Counting Is Hard)}},
  booktitle =	{17th International Symposium on Parameterized and Exact Computation (IPEC 2022)},
  pages =	{22:1--22:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-260-0},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{249},
  editor =	{Dell, Holger and Nederlof, Jesper},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2022.22},
  URN =		{urn:nbn:de:0030-drops-173780},
  doi =		{10.4230/LIPIcs.IPEC.2022.22},
  annote =	{Keywords: Anti-Factor, General Factor, Treewidth, Representative Sets, SETH}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)

Authors: Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný

Published in: Dagstuhl Reports, Volume 12, Issue 5 (2022)


Abstract
Constraint satisfaction has always played a central role in computational complexity theory; appropriate versions of CSPs are classical complete problems for most standard complexity classes. CSPs constitute a very rich and yet sufficiently manageable class of problems to give a good perspective on general computational phenomena. For instance, they help to understand which mathematical properties make a computational problem tractable (in a wide sense, e.g., polynomial-time solvable, non-trivially approximable, fixed-parameter tractable, or definable in a weak logic). In the last 15 years, research activity in this area has significantly intensified and hugely impressive progress was made. The Dagstuhl Seminar 22201 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP so that they can share their insights obtained during the past four years. This report documents the material presented during the course of the seminar.

Cite as

Martin Grohe, Venkatesan Guruswami, Dániel Marx, and Stanislav Živný. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201). In Dagstuhl Reports, Volume 12, Issue 5, pp. 112-130, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@Article{grohe_et_al:DagRep.12.5.112,
  author =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 22201)}},
  pages =	{112--130},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2022},
  volume =	{12},
  number =	{5},
  editor =	{Grohe, Martin and Guruswami, Venkatesan and Marx, D\'{a}niel and \v{Z}ivn\'{y}, Stanislav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.12.5.112},
  URN =		{urn:nbn:de:0030-drops-174453},
  doi =		{10.4230/DagRep.12.5.112},
  annote =	{Keywords: Constraint satisfaction problem (CSP); Computational complexity; Hardness of approximation; Universal algebra; Semidefinite programming}
}
Document
Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search

Authors: Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma

Published in: LIPIcs, Volume 244, 30th Annual European Symposium on Algorithms (ESA 2022)


Abstract
We generalize the monotone local search approach of Fomin, Gaspers, Lokshtanov and Saurabh [J.ACM 2019], by establishing a connection between parameterized approximation and exponential-time approximation algorithms for monotone subset minimization problems. In a monotone subset minimization problem the input implicitly describes a non-empty set family over a universe of size n which is closed under taking supersets. The task is to find a minimum cardinality set in this family. Broadly speaking, we use approximate monotone local search to show that a parameterized α-approximation algorithm that runs in c^k⋅n^𝒪(1) time, where k is the solution size, can be used to derive an α-approximation randomized algorithm that runs in dⁿ⋅n^𝒪(1) time, where d is the unique value in (1, 1+(c-1)/α) such that 𝒟(1/α ‖ (d-1)/(c-1)) = (ln c)/α and 𝒟(a‖b) is the Kullback-Leibler divergence. This running time matches that of Fomin et al. for α = 1, and is strictly better when α > 1, for any c > 1. Furthermore, we also show that this result can be derandomized at the expense of a sub-exponential multiplicative factor in the running time. We use an approximate variant of the exhaustive search as a benchmark for our algorithm. We show that the classic 2ⁿ⋅n^𝒪(1) exhaustive search can be adapted to an α-approximate exhaustive search that runs in time (1+exp(-α⋅ℋ(1/α)))ⁿ⋅n^𝒪(1), where ℋ is the entropy function. Furthermore, we provide a lower bound stating that the running time of this α-approximate exhaustive search is the best achievable running time in an oracle model. When compared to approximate exhaustive search, and to other techniques, the running times obtained by approximate monotone local search are strictly better for any α ≥ 1, c > 1. We demonstrate the potential of approximate monotone local search by deriving new and faster exponential approximation algorithms for Vertex Cover, 3-Hitting Set, Directed Feedback Vertex Set, Directed Subset Feedback Vertex Set, Directed Odd Cycle Transversal and Undirected Multicut. For instance, we get a 1.1-approximation algorithm for Vertex Cover with running time 1.114ⁿ⋅n^𝒪(1), improving upon the previously best known 1.1-approximation running in time 1.127ⁿ⋅n^𝒪(1) by Bourgeois et al. [DAM 2011].
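The base d in the statement above can be computed numerically: 𝒟(1/α ‖ (d-1)/(c-1)) decreases from ∞ to 0 as d ranges over (1, 1+(c-1)/α), so a simple bisection finds the unique solution. The sketch below does this with illustrative constants (not the problem-specific values of c used in the paper); for α = 1 it reproduces the closed form d = 2 - 1/c of Fomin et al.

# Solve KL(1/alpha || (d-1)/(c-1)) = ln(c)/alpha for d by bisection, where
# KL(a||b) = a*ln(a/b) + (1-a)*ln((1-a)/(1-b)) is the Kullback-Leibler
# divergence. The constants in the demo are illustrative only.
import math


def kl(a, b):
    t1 = a * math.log(a / b) if a > 0 else 0.0
    t2 = (1 - a) * math.log((1 - a) / (1 - b)) if a < 1 else 0.0
    return t1 + t2


def solve_d(c, alpha, iters=200):
    a = 1.0 / alpha
    target = math.log(c) / alpha
    lo, hi = 1e-12, a           # KL decreases from +inf to 0 as b goes from 0 to a
    for _ in range(iters):
        mid = (lo + hi) / 2
        if kl(a, mid) > target:
            lo = mid
        else:
            hi = mid
    b = (lo + hi) / 2
    return 1 + b * (c - 1)


if __name__ == "__main__":
    print(solve_d(c=2.0, alpha=1.0))   # ~1.5, matching d = 2 - 1/c
    print(solve_d(c=2.0, alpha=1.5))   # strictly smaller base once alpha > 1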

Cite as

Barış Can Esmer, Ariel Kulik, Dániel Marx, Daniel Neuen, and Roohani Sharma. Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search. In 30th Annual European Symposium on Algorithms (ESA 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 244, pp. 50:1-50:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{esmer_et_al:LIPIcs.ESA.2022.50,
  author =	{Esmer, Bar{\i}\c{s} Can and Kulik, Ariel and Marx, D\'{a}niel and Neuen, Daniel and Sharma, Roohani},
  title =	{{Faster Exponential-Time Approximation Algorithms Using Approximate Monotone Local Search}},
  booktitle =	{30th Annual European Symposium on Algorithms (ESA 2022)},
  pages =	{50:1--50:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-247-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{244},
  editor =	{Chechik, Shiri and Navarro, Gonzalo and Rotenberg, Eva and Herman, Grzegorz},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2022.50},
  URN =		{urn:nbn:de:0030-drops-169887},
  doi =		{10.4230/LIPIcs.ESA.2022.50},
  annote =	{Keywords: parameterized approximations, exponential approximations, monotone local search}
}
Document
Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves

Authors: Karl Bringmann, Sándor Kisfaludi‑Bak, Marvin Künnemann, Dániel Marx, and André Nusser

Published in: LIPIcs, Volume 224, 38th International Symposium on Computational Geometry (SoCG 2022)


Abstract
The Dynamic Time Warping (DTW) distance is a popular measure of similarity for a variety of sequence data. For comparing polygonal curves π, σ in ℝ^d, it provides a robust, outlier-insensitive alternative to the Fréchet distance. However, like the Fréchet distance, the DTW distance is not invariant under translations. Can we efficiently optimize the DTW distance of π and σ under arbitrary translations, to compare the curves' shape irrespective of their absolute location? There are surprisingly few works in this direction, which may be due to its computational intricacy: For the Euclidean norm, this problem contains as a special case the geometric median problem, which provably admits no exact algebraic algorithm (that is, no algorithm using only addition, multiplication, and k-th roots). We thus investigate exact algorithms for non-Euclidean norms as well as approximation algorithms for the Euclidean norm. For the L₁ norm in ℝ^d, we provide an 𝒪(n^{2(d+1)})-time algorithm, i.e., an exact polynomial-time algorithm for constant d. Here and below, n bounds the curves' complexities. For the Euclidean norm in ℝ², we show that a simple problem-specific insight leads to a (1+ε)-approximation in time 𝒪(n³/ε²). We then show how to obtain a subcubic 𝒪̃(n^{2.5}/ε²) time algorithm with significant new ideas; this time comes close to the well-known quadratic time barrier for computing DTW for fixed translations. Technically, the algorithm is obtained by speeding up repeated DTW distance estimations using a dynamic data structure for maintaining shortest paths in weighted planar digraphs. Crucially, we show how to traverse a candidate set of translations using space-filling curves in a way that incurs only few updates to the data structure. We hope that our results will facilitate the use of DTW under translation both in theory and practice, and inspire similar algorithmic approaches for related geometric optimization problems.
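For readers unfamiliar with DTW itself, here is the textbook O(nm) dynamic program for the DTW distance of two point sequences of lengths n and m at a fixed position; optimizing this quantity over translations, the subject of the paper, is not attempted here.

# Textbook dynamic program for the Dynamic Time Warping distance between two
# point sequences at a fixed position (no translation optimization).
import math


def dtw(P, Q, dist=lambda p, q: math.dist(p, q)):
    n, m = len(P), len(Q)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(P[i - 1], Q[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]


if __name__ == "__main__":
    P = [(0, 0), (1, 0), (2, 0)]
    Q = [(0, 0), (2, 0)]
    print(dtw(P, Q))  # 1.0: only the middle point of P pays a matching cost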

Cite as

Karl Bringmann, Sándor Kisfaludi‑Bak, Marvin Künnemann, Dániel Marx, and André Nusser. Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves. In 38th International Symposium on Computational Geometry (SoCG 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 224, pp. 20:1-20:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


Copy BibTex To Clipboard

@InProceedings{bringmann_et_al:LIPIcs.SoCG.2022.20,
  author =	{Bringmann, Karl and Kisfaludi‑Bak, S\'{a}ndor and K\"{u}nnemann, Marvin and Marx, D\'{a}niel and Nusser, Andr\'{e}},
  title =	{{Dynamic Time Warping Under Translation: Approximation Guided by Space-Filling Curves}},
  booktitle =	{38th International Symposium on Computational Geometry (SoCG 2022)},
  pages =	{20:1--20:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-227-3},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{224},
  editor =	{Goaoc, Xavier and Kerber, Michael},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2022.20},
  URN =		{urn:nbn:de:0030-drops-160287},
  doi =		{10.4230/LIPIcs.SoCG.2022.20},
  annote =	{Keywords: Dynamic Time Warping, Sequence Similarity Measures}
}
Document
Track A: Algorithms, Complexity and Games
Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth

Authors: Dániel Marx, Govind S. Sankar, and Philipp Schepper

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
In the General Factor problem, we are given an undirected graph G and for each vertex v ∈ V(G) a finite set B_v of non-negative integers. The task is to decide if there is a subset S ⊆ E(G) such that deg_S(v) ∈ B_v for all vertices v of G. Define the max-gap of a finite integer set B to be the largest d ≥ 0 such that there is an a ≥ 0 with [a,a+d+1] ∩ B = {a,a+d+1}. Cornuéjols showed in 1988 that if the max-gap of all sets B_v is at most 1, then the decision version of General Factor is polynomial-time solvable. This result was extended in 2018 by Dudycz and Paluch to the optimization (i.e. minimization and maximization) versions. We present a general algorithm counting the number of solutions of a certain size in time (M+1)^{tw} ⋅ n^{𝒪(1)}, given a tree decomposition of width tw, where M is the maximum integer over all B_v. By using convolution techniques from van Rooij (2020), we improve upon the previous (M+1)^{3tw} ⋅ n^{𝒪(1)} time algorithm by Arulselvan et al. from 2018. We prove that this algorithm is essentially optimal for all cases that are not trivial or polynomial-time solvable for the decision, minimization or maximization versions. Our lower bounds show that such an improvement is not even possible for B-Factor, which is General Factor on graphs where all sets B_v agree with the fixed set B. We show that for every fixed B where the problem is NP-hard, our (max B+1)^{tw} ⋅ n^{𝒪(1)} algorithm cannot be significantly improved: assuming the Strong Exponential Time Hypothesis (SETH), no algorithm can solve B-Factor in time (max B+1-ε)^{tw} ⋅ n^{𝒪(1)} for any ε > 0. We extend this bound to the counting version of B-Factor for arbitrary, non-trivial sets B, assuming #SETH. We also investigate the parameterization of the problem by cutwidth. Unlike for treewidth, having a larger set B does not appear to make the problem harder: we give a 2^{cutw} ⋅ n^{𝒪(1)} algorithm for any B and provide a matching lower bound showing that this is optimal for the NP-hard cases.
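Two tiny helpers make the definitions above concrete: max_gap computes the max-gap of a degree set B, and general_factor performs a brute-force search over edge subsets. Both are our own illustrations; the paper's algorithms work on a tree decomposition instead and run in (M+1)^{tw} ⋅ n^{𝒪(1)} time.

# max_gap(B): largest d such that two members of B are d+1 apart with nothing
# of B strictly between them. general_factor: exhaustive search over E(G).
from itertools import combinations


def max_gap(B):
    b = sorted(B)
    return max((b[i + 1] - b[i] - 1 for i in range(len(b) - 1)), default=0)


def general_factor(nodes, edges, B):
    """B: dict mapping each vertex v to its allowed degree set B_v."""
    for size in range(len(edges) + 1):
        for S in combinations(edges, size):
            deg = {v: 0 for v in nodes}
            for u, v in S:
                deg[u] += 1
                deg[v] += 1
            if all(deg[v] in B[v] for v in nodes):
                return set(S)
    return None


if __name__ == "__main__":
    print(max_gap({1, 3}))   # 1: within the polynomial-time range of Cornuejols' result
    print(max_gap({0, 3}))   # 2
    # A perfect matching of the 4-cycle, found as a {1}-factor:
    nodes = [0, 1, 2, 3]
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    print(general_factor(nodes, edges, {v: {1} for v in nodes}))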

Cite as

Dániel Marx, Govind S. Sankar, and Philipp Schepper. Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 95:1-95:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ICALP.2021.95,
  author =	{Marx, D\'{a}niel and Sankar, Govind S. and Schepper, Philipp},
  title =	{{Degrees and Gaps: Tight Complexity Results of General Factor Problems Parameterized by Treewidth and Cutwidth}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{95:1--95:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.95},
  URN =		{urn:nbn:de:0030-drops-141647},
  doi =		{10.4230/LIPIcs.ICALP.2021.95},
  annote =	{Keywords: General Factor, General Matching, Treewidth, Cutwidth}
}
Document
On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting

Authors: Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram

Published in: LIPIcs, Volume 192, 2nd Symposium on Foundations of Responsible Computing (FORC 2021)


Abstract
Redistricting is the problem of dividing up a state into a given number k of regions (called districts) where the voters in each district are to elect a representative. The three primary criteria are: that each district be connected, that the populations of the districts be equal (or nearly equal), and that the districts are "compact". There are multiple competing definitions of compactness, usually minimizing some quantity. One measure that has recently been used is the number of cut edges. In this formulation of redistricting, one is given atomic regions out of which each district must be built (e.g., in the U.S., census blocks). The populations of the atomic regions are given. Consider the graph with one vertex per atomic region and an edge between atomic regions with a shared boundary of positive length. Define the weight of a vertex to be the population of the corresponding region. A districting plan is a partition of vertices into k pieces so that the parts have nearly equal weights and each part is connected. The districts are considered compact to the extent that the plan minimizes the number of edges crossing between different parts. There are two natural computational problems: find the most compact districting plan, and sample districting plans (possibly under a compactness constraint) uniformly at random. Both problems are NP-hard so we consider restricting the input graph to have branchwidth at most w. (A planar graph’s branchwidth is bounded, for example, by its diameter.) If both k and w are bounded by constants, the problems are solvable in polynomial time. In this paper, we give lower and upper bounds that characterize the complexity of these problems in terms of parameters k and w. For simplicity of notation, assume that each vertex has unit weight. We would ideally like algorithms whose running times are of the form O(f(k,w) n^c) for some constant c independent of k and w (in which case the problems are said to be fixed-parameter tractable with respect to those parameters). We show that, under standard complexity-theoretic assumptions, no such algorithms exist. However, the problems are fixed-parameter tractable with respect to each of these parameters individually: there exist algorithms with running times of the form O(f(k) n^{O(w)}) and O(f(w) n^{k+1}). The first result was previously known. The new one, however, is more relevant to the application to redistricting, at least for coarse instances. Indeed, we have implemented a version of the algorithm and have used it to successfully find optimally compact solutions to all redistricting instances for France (except Paris, which operates under different rules) under various population-balance constraints. For these instances, the values for w are modest and the values for k are very small.
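
A minimal sketch of the feasibility and objective checks in this formulation (connectivity, approximate balance, number of cut edges); it assumes vertex weights are given as a dictionary and the balance tolerance is given explicitly, and it only evaluates a plan, unlike the paper's branchwidth-based algorithm:

from collections import defaultdict, deque

def is_connected(part, adj):
    part = set(part)
    start = next(iter(part))
    seen, queue = {start}, deque([start])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u in part and u not in seen:
                seen.add(u)
                queue.append(u)
    return seen == part

def cut_edges(plan, edges):
    # plan maps each vertex (atomic region) to its district index.
    return sum(1 for u, v in edges if plan[u] != plan[v])

def valid_plan(plan, edges, weight, k, tolerance):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    parts = defaultdict(list)
    for v, d in plan.items():
        parts[d].append(v)
    if len(parts) != k:
        return False
    loads = [sum(weight[v] for v in p) for p in parts.values()]
    balanced = max(loads) - min(loads) <= tolerance
    return balanced and all(is_connected(p, adj) for p in parts.values())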

Cite as

Vincent Cohen-Addad, Philip N. Klein, Dániel Marx, Archer Wheeler, and Christopher Wolfram. On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting. In 2nd Symposium on Foundations of Responsible Computing (FORC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 192, pp. 3:1-3:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{cohenaddad_et_al:LIPIcs.FORC.2021.3,
  author =	{Cohen-Addad, Vincent and Klein, Philip N. and Marx, D\'{a}niel and Wheeler, Archer and Wolfram, Christopher},
  title =	{{On the Computational Tractability of a Geographic Clustering Problem Arising in Redistricting}},
  booktitle =	{2nd Symposium on Foundations of Responsible Computing (FORC 2021)},
  pages =	{3:1--3:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-187-0},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{192},
  editor =	{Ligett, Katrina and Gupta, Swati},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.FORC.2021.3},
  URN =		{urn:nbn:de:0030-drops-138718},
  doi =		{10.4230/LIPIcs.FORC.2021.3},
  annote =	{Keywords: redistricting, algorithms, planar graphs, lower bounds}
}
Document
Chordless Cycle Packing Is Fixed-Parameter Tractable

Authors: Dániel Marx

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
A chordless cycle or hole in a graph G is an induced cycle of length at least 4. In the Hole Packing problem, a graph G and an integer k are given, and the task is to find (if one exists) a set of k pairwise vertex-disjoint chordless cycles. Our main result is showing that Hole Packing is fixed-parameter tractable (FPT), that is, can be solved in time f(k)n^O(1) for some function f depending only on k.
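
A minimal verifier sketch for the Hole Packing definition, assuming the graph is given as an edge list and a candidate solution as a list of cyclically ordered vertex sequences; it only checks a solution, it does not find one:

def is_hole(cycle, edges):
    """Is the vertex sequence `cycle` an induced (chordless) cycle of length >= 4?"""
    E = {frozenset(e) for e in edges}
    n = len(cycle)
    if n < 4 or len(set(cycle)) != n:
        return False
    for i in range(n):
        for j in range(i + 1, n):
            adjacent = frozenset((cycle[i], cycle[j])) in E
            consecutive = (j - i == 1) or (i == 0 and j == n - 1)
            if adjacent != consecutive:  # exactly the cycle edges may be present
                return False
    return True

def is_hole_packing(cycles, edges, k):
    """Are `cycles` at least k pairwise vertex-disjoint holes of the graph?"""
    used = [v for c in cycles for v in c]
    return (len(cycles) >= k and len(used) == len(set(used))
            and all(is_hole(c, edges) for c in cycles))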

Cite as

Dániel Marx. Chordless Cycle Packing Is Fixed-Parameter Tractable. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 71:1-71:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.ESA.2020.71,
  author =	{Marx, D\'{a}niel},
  title =	{{Chordless Cycle Packing Is Fixed-Parameter Tractable}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{71:1--71:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.71},
  URN =		{urn:nbn:de:0030-drops-129373},
  doi =		{10.4230/LIPIcs.ESA.2020.71},
  annote =	{Keywords: chordal graphs, packing, fixed-parameter tractability}
}
Document
Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy

Authors: Dániel Marx and R. B. Sandeep

Published in: LIPIcs, Volume 173, 28th Annual European Symposium on Algorithms (ESA 2020)


Abstract
Given a graph G and an integer k, the H-free Edge Editing problem is to find whether there exist at most k pairs of vertices in G such that changing the adjacency of the pairs in G results in a graph without any induced copy of H. The existence of polynomial kernels for H-free Edge Editing (that is, whether it is possible to reduce the size of the instance to k^O(1) in polynomial time) received significant attention in the parameterized complexity literature. Nontrivial polynomial kernels are known to exist for some graphs H with at most 4 vertices (e.g., path on 3 or 4 vertices, diamond, paw), but starting from 5 vertices, polynomial kernels are known only if H is either complete or empty. This suggests the conjecture that there is no other H with at least 5 vertices where H-free Edge Editing admits a polynomial kernel. Towards this goal, we obtain a set ℋ of nine 5-vertex graphs such that if for every H ∈ ℋ, H-free Edge Editing is incompressible and the complexity assumption NP ⊈ coNP/poly holds, then H-free Edge Editing is incompressible for every graph H with at least five vertices that is neither complete nor empty. That is, proving incompressibility for these nine graphs would give a complete classification of the kernelization complexity of H-free Edge Editing for every H with at least 5 vertices. We obtain a similar result also for H-free Edge Deletion. Here the picture is more complicated due to the existence of another infinite family of graphs H where the problem is trivial (graphs with exactly one edge). We obtain a larger set ℋ of nineteen graphs whose incompressibility would give a complete classification of the kernelization complexity of H-free Edge Deletion for every graph H with at least 5 vertices. Analogous results follow also for the H-free Edge Completion problem by simple complementation.
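
A purely illustrative brute-force sketch of the H-free Edge Editing definition (the paper itself concerns kernelization lower bounds, not algorithms); graphs are assumed to be given as vertex and edge lists:

from itertools import combinations, permutations

def has_induced_copy(V, E, H_V, H_E):
    """Does G = (V, E) contain an induced copy of H = (H_V, H_E)?"""
    E = {frozenset(e) for e in E}
    H_edges = {frozenset(e) for e in H_E}
    for subset in combinations(V, len(H_V)):
        for image in permutations(subset):
            phi = dict(zip(H_V, image))
            if all((frozenset((phi[a], phi[b])) in E) == (frozenset((a, b)) in H_edges)
                   for a, b in combinations(H_V, 2)):
                return True
    return False

def h_free_editing(V, E, H_V, H_E, k):
    """Find at most k vertex pairs whose adjacency flip makes G induced-H-free."""
    E = {frozenset(e) for e in E}
    pairs = [frozenset(p) for p in combinations(V, 2)]
    for r in range(k + 1):
        for flips in combinations(pairs, r):
            edited = E.symmetric_difference(flips)
            if not has_induced_copy(V, edited, H_V, H_E):
                return set(flips)
    return None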

Cite as

Dániel Marx and R. B. Sandeep. Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy. In 28th Annual European Symposium on Algorithms (ESA 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 173, pp. 72:1-72:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ESA.2020.72,
  author =	{Marx, D\'{a}niel and Sandeep, R. B.},
  title =	{{Incompressibility of H-Free Edge Modification Problems: Towards a Dichotomy}},
  booktitle =	{28th Annual European Symposium on Algorithms (ESA 2020)},
  pages =	{72:1--72:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-162-7},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{173},
  editor =	{Grandoni, Fabrizio and Herman, Grzegorz and Sanders, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2020.72},
  URN =		{urn:nbn:de:0030-drops-129383},
  doi =		{10.4230/LIPIcs.ESA.2020.72},
  annote =	{Keywords: incompressibility, edge modification problems, H-free graphs}
}
Document
Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction

Authors: Marvin Künnemann and Dániel Marx

Published in: LIPIcs, Volume 169, 35th Computational Complexity Conference (CCC 2020)


Abstract
To study the question under which circumstances small solutions can be found faster than by exhaustive search (and by how much), we study the fine-grained complexity of Boolean constraint satisfaction with size constraint exactly k. More precisely, we aim to determine, for any finite constraint family, the optimal running time f(k)n^g(k) required to find satisfying assignments that set precisely k of the n variables to 1. Under central hardness assumptions on detecting cliques in graphs and 3-uniform hypergraphs, we give an almost tight characterization of g(k) into four regimes: 1) Brute force is essentially best-possible, i.e., g(k) = (1 ± o(1))k, 2) the best algorithms are as fast as current k-clique algorithms, i.e., g(k) = (ω/3 ± o(1))k, 3) the exponent has sublinear dependence on k with g(k) ∈ [Ω(∛k), O(√k)], or 4) the problem is fixed-parameter tractable, i.e., g(k) = O(1). This yields a more fine-grained perspective than a previous FPT/W[1]-hardness dichotomy (Marx, Computational Complexity 2005). Our most interesting technical contribution is a f(k)n^(4√k)-time algorithm for SubsetSum with precedence constraints parameterized by the target k - particularly the approach, based on generalizing a bound on the Frobenius coin problem to a setting with precedence constraints, might be of independent interest.
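
A minimal brute-force sketch of the exhaustive search that the paper's classification measures against: find a satisfying assignment that sets exactly k of the n variables to 1, assuming constraints are given as (scope, allowed-tuples) pairs:

from itertools import combinations

def min_ones_exact_k(n, constraints, k):
    """constraints: list of (scope, allowed) with scope a tuple of variable
    indices and allowed a set of accepted 0/1-tuples on that scope."""
    for ones in combinations(range(n), k):
        x = [0] * n
        for i in ones:
            x[i] = 1
        if all(tuple(x[i] for i in scope) in allowed for scope, allowed in constraints):
            return x
    return None

# Example: (x0 OR x1) and NOT(x0 AND x2), with exactly one variable set to 1.
cons = [((0, 1), {(0, 1), (1, 0), (1, 1)}), ((0, 2), {(0, 0), (0, 1), (1, 0)})]
print(min_ones_exact_k(3, cons, 1))   # [1, 0, 0]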

Cite as

Marvin Künnemann and Dániel Marx. Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction. In 35th Computational Complexity Conference (CCC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 169, pp. 27:1-27:28, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{kunnemann_et_al:LIPIcs.CCC.2020.27,
  author =	{K\"{u}nnemann, Marvin and Marx, D\'{a}niel},
  title =	{{Finding Small Satisfying Assignments Faster Than Brute Force: A Fine-Grained Perspective into Boolean Constraint Satisfaction}},
  booktitle =	{35th Computational Complexity Conference (CCC 2020)},
  pages =	{27:1--27:28},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-156-6},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{169},
  editor =	{Saraf, Shubhangi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CCC.2020.27},
  URN =		{urn:nbn:de:0030-drops-125791},
  doi =		{10.4230/LIPIcs.CCC.2020.27},
  annote =	{Keywords: Fine-grained complexity theory, algorithmic classification theorem, multivariate algorithms and complexity, constraint satisfaction problems, satisfiability}
}
Document
Track A: Algorithms, Complexity and Games
Hitting Long Directed Cycles Is Fixed-Parameter Tractable

Authors: Alexander Göke, Dániel Marx, and Matthias Mnich

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
In the Directed Long Cycle Hitting Set problem we are given a directed graph G, and the task is to find a set S of at most k vertices/arcs such that G-S has no cycle of length longer than ℓ. We show that the problem can be solved in time 2^O(ℓ^6 + ℓ k^3 log k + k^5 log k log ℓ) ⋅ n^O(1), that is, it is fixed-parameter tractable (FPT) parameterized by k and ℓ. This algorithm can be seen as a far-reaching generalization of the fixed-parameter tractability of Mixed Graph Feedback Vertex Set [Bonsma and Lokshtanov WADS 2011], which is already a common generalization of the fixed-parameter tractability of (undirected) Feedback Vertex Set and the Directed Feedback Vertex Set problems, two classic results in parameterized algorithms. The algorithm requires significant insights into the structure of graphs without directed cycles of length longer than ℓ and can be seen as an exact version of the approximation algorithm following from the Erdős-Pósa property for long cycles in directed graphs proved by Kreutzer and Kawarabayashi [STOC 2015].
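
A minimal verifier sketch for the vertex-deletion version of Directed Long Cycle Hitting Set, assuming the networkx library is available; it only checks a candidate set S, and enumerating simple cycles is of course exponential in general:

import networkx as nx

def is_long_cycle_hitting_set(G, S, l):
    """After deleting the vertices in S, is every remaining simple cycle of length <= l?"""
    H = G.copy()
    H.remove_nodes_from(S)
    return all(len(cycle) <= l for cycle in nx.simple_cycles(H))

# Deleting vertex 3 destroys the 4-cycle 1->2->3->4->1; only the 2-cycle on {1,2} remains.
G = nx.DiGraph([(1, 2), (2, 3), (3, 4), (4, 1), (2, 1)])
print(is_long_cycle_hitting_set(G, {3}, 2))   # True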

Cite as

Alexander Göke, Dániel Marx, and Matthias Mnich. Hitting Long Directed Cycles Is Fixed-Parameter Tractable. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 59:1-59:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{goke_et_al:LIPIcs.ICALP.2020.59,
  author =	{G\"{o}ke, Alexander and Marx, D\'{a}niel and Mnich, Matthias},
  title =	{{Hitting Long Directed Cycles Is Fixed-Parameter Tractable}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{59:1--59:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.59},
  URN =		{urn:nbn:de:0030-drops-124664},
  doi =		{10.4230/LIPIcs.ICALP.2020.59},
  annote =	{Keywords: Directed graphs, directed feedback vertex set, circumference}
}
Document
Finding and Counting Permutations via CSPs

Authors: Benjamin Aram Berendsohn, László Kozma, and Dániel Marx

Published in: LIPIcs, Volume 148, 14th International Symposium on Parameterized and Exact Computation (IPEC 2019)


Abstract
Permutation patterns and pattern avoidance have been intensively studied in combinatorics and computer science, going back at least to the seminal work of Knuth on stack-sorting (1968). Perhaps the most natural algorithmic question in this area is deciding whether a given permutation of length n contains a given pattern of length k. In this work we give two new algorithms for this well-studied problem, one whose running time is n^{k/4 + o(k)}, and a polynomial-space algorithm whose running time is the better of O(1.6181^n) and O(n^{k/2 + 1}). These results improve the earlier best bounds of n^{0.47k + o(k)} and O(1.79^n) due to Ahal and Rabinovich (2000) resp. Bruner and Lackner (2012) and are the fastest algorithms for the problem when k in Omega(log{n}). We show that both our new algorithms and the previous exponential-time algorithms in the literature can be viewed through the unifying lens of constraint-satisfaction. Our algorithms can also count, within the same running time, the number of occurrences of a pattern. We show that this result is close to optimal: solving the counting problem in time f(k) * n^{o(k/log{k})} would contradict the exponential-time hypothesis (ETH). For some special classes of patterns we obtain improved running times. We further prove that 3-increasing and 3-decreasing permutations can, in some sense, embed arbitrary permutations of almost linear length, which indicates that an algorithm with sub-exponential running time is unlikely, even for patterns from these restricted classes.
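
A brute-force sketch of pattern containment that checks every k-subset of positions for order-isomorphism with the pattern; this is the trivial baseline the paper improves upon, not the paper's algorithm:

from itertools import combinations

def contains_pattern(perm, pattern):
    """Does perm contain k entries in the same relative order as pattern?"""
    k = len(pattern)
    order = sorted(range(k), key=lambda i: pattern[i])
    for idx in combinations(range(len(perm)), k):
        values = [perm[i] for i in idx]
        if sorted(range(k), key=lambda i: values[i]) == order:
            return True
    return False

print(contains_pattern([3, 1, 4, 5, 2], [1, 3, 2]))   # True: e.g. the entries 1, 4, 2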

Cite as

Benjamin Aram Berendsohn, László Kozma, and Dániel Marx. Finding and Counting Permutations via CSPs. In 14th International Symposium on Parameterized and Exact Computation (IPEC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 148, pp. 1:1-1:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{berendsohn_et_al:LIPIcs.IPEC.2019.1,
  author =	{Berendsohn, Benjamin Aram and Kozma, L\'{a}szl\'{o} and Marx, D\'{a}niel},
  title =	{{Finding and Counting Permutations via CSPs}},
  booktitle =	{14th International Symposium on Parameterized and Exact Computation (IPEC 2019)},
  pages =	{1:1--1:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-129-0},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{148},
  editor =	{Jansen, Bart M. P. and Telle, Jan Arne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2019.1},
  URN =		{urn:nbn:de:0030-drops-114627},
  doi =		{10.4230/LIPIcs.IPEC.2019.1},
  annote =	{Keywords: permutations, pattern matching, constraint satisfaction, exponential time}
}
Document
Parameterized Streaming Algorithms for Min-Ones d-SAT

Authors: Akanksha Agrawal, Arindam Biswas, Édouard Bonnet, Nick Brettell, Radu Curticapean, Dániel Marx, Tillmann Miltzow, Venkatesh Raman, and Saket Saurabh

Published in: LIPIcs, Volume 150, 39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019)


Abstract
In this work, we initiate the study of the Min-Ones d-SAT problem in the parameterized streaming model. An instance of the problem consists of a d-CNF formula F and an integer k, and the objective is to determine if F has a satisfying assignment which sets at most k variables to 1. In the parameterized streaming model, input is provided as a stream, just as in the usual streaming model. A key difference is that the bound on the read-write memory available to the algorithm is O(f(k) log n) (f: N -> N, a computable function) as opposed to the O(log n) bound of the usual streaming model. The other important difference is that the number of passes the algorithm makes over its input must be a (preferably small) function of k. We design a (k + 1)-pass parameterized streaming algorithm that solves Min-Ones d-SAT (d >= 2) using space O((kd^(ck) + k^d)log n) (c > 0, a constant) and a (d + 1)^k-pass algorithm that uses space O(k log n). We also design a streaming kernelization for Min-Ones 2-SAT that makes (k + 2) passes and uses space O(k^6 log n) to produce a kernel with O(k^6) clauses. To complement these positive results, we show that any k-pass algorithm for Min-Ones d-SAT (d >= 2) requires space Omega(max{n^(1/k) / 2^k, log(n / k)}) on instances (F, k). This is achieved via a reduction from the streaming problem POT Pointer Chasing (Guha and McGregor [ICALP 2008]), which might be of independent interest. Given this, our (k + 1)-pass parameterized streaming algorithm is the best possible, as far as the number of passes is concerned. In contrast to the results of Fafianie and Kratsch [MFCS 2014] and Chitnis et al. [SODA 2015], who independently showed that there are 1-pass parameterized streaming algorithms for Vertex Cover (a restriction of Min-Ones 2-SAT), we show using lower bounds from Communication Complexity that for any d >= 1, a 1-pass streaming algorithm for Min-Ones d-SAT requires space Omega(n). This excludes the possibility of a 1-pass parameterized streaming algorithm for the problem. Additionally, we show that any p-pass algorithm for the problem requires space Omega(n/p).
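
A minimal, non-streaming brute-force sketch of the underlying Min-Ones d-SAT question (the paper's contribution is solving it with small memory in few passes over a streamed formula); clauses are assumed to be given in the usual signed-integer form:

from itertools import combinations

def min_ones_sat(n, clauses, k):
    """Clauses over variables 1..n; literal v > 0 is positive, v < 0 is negated."""
    for r in range(k + 1):
        for ones in combinations(range(1, n + 1), r):
            true_vars = set(ones)
            if all(any((lit > 0) == (abs(lit) in true_vars) for lit in clause)
                   for clause in clauses):
                return true_vars
    return None

# (x1 or x2) and (not x1 or x3): setting only x2 to 1 suffices.
print(min_ones_sat(3, [[1, 2], [-1, 3]], 1))   # {2}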

Cite as

Akanksha Agrawal, Arindam Biswas, Édouard Bonnet, Nick Brettell, Radu Curticapean, Dániel Marx, Tillmann Miltzow, Venkatesh Raman, and Saket Saurabh. Parameterized Streaming Algorithms for Min-Ones d-SAT. In 39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 150, pp. 8:1-8:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{agrawal_et_al:LIPIcs.FSTTCS.2019.8,
  author =	{Agrawal, Akanksha and Biswas, Arindam and Bonnet, \'{E}douard and Brettell, Nick and Curticapean, Radu and Marx, D\'{a}niel and Miltzow, Tillmann and Raman, Venkatesh and Saurabh, Saket},
  title =	{{Parameterized Streaming Algorithms for Min-Ones d-SAT}},
  booktitle =	{39th IARCS Annual Conference on Foundations of Software Technology and Theoretical Computer Science (FSTTCS 2019)},
  pages =	{8:1--8:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-131-3},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{150},
  editor =	{Chattopadhyay, Arkadev and Gastin, Paul},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSTTCS.2019.8},
  URN =		{urn:nbn:de:0030-drops-115708},
  doi =		{10.4230/LIPIcs.FSTTCS.2019.8},
  annote =	{Keywords: min, ones, sat, d-sat, parameterized, kernelization, streaming, space, efficient, algorithm, parameter}
}
Document
How Does Object Fatness Impact the Complexity of Packing in d Dimensions?

Authors: Sándor Kisfaludi-Bak, Dániel Marx, and Tom C. van der Zanden

Published in: LIPIcs, Volume 149, 30th International Symposium on Algorithms and Computation (ISAAC 2019)


Abstract
Packing is a classical problem where one is given a set of subsets of Euclidean space called objects, and the goal is to find a maximum size subset of objects that are pairwise non-intersecting. The problem is also known as the Independent Set problem on the intersection graph defined by the objects. Although the problem is NP-complete, there are several subexponential algorithms in the literature. One of the key assumptions of such algorithms has been that the objects are fat, with a few exceptions in two dimensions; for example, the packing problem of a set of polygons in the plane surprisingly admits a subexponential algorithm. In this paper we give tight running time bounds for packing similarly-sized non-fat objects in higher dimensions. We propose an alternative and very weak measure of fatness called the stabbing number, and show that the packing problem in Euclidean space of constant dimension d >= 3 for a family of similarly sized objects with stabbing number alpha can be solved in 2^O(n^(1-1/d) alpha) time. We prove that even in the case of axis-parallel boxes of fixed shape, there is no 2^o(n^(1-1/d) alpha) algorithm under ETH. This result smoothly bridges the whole range of having constant-fat objects on one extreme (alpha=1) and a subexponential algorithm of the usual running time, and having very "skinny" objects on the other extreme (alpha=n^(1/d)), where we cannot hope to improve upon the brute force running time of 2^O(n), and thereby characterizes the impact of fatness on the complexity of packing in case of similarly sized objects. We also study the same problem when parameterized by the solution size k, and give an n^O(k^(1-1/d) alpha) algorithm, with an almost matching lower bound: there is no algorithm with running time of the form f(k) n^o(k^(1-1/d) alpha/log k) under ETH. One of our main tools in these reductions is a new wiring theorem that may be of independent interest.

Cite as

Sándor Kisfaludi-Bak, Dániel Marx, and Tom C. van der Zanden. How Does Object Fatness Impact the Complexity of Packing in d Dimensions?. In 30th International Symposium on Algorithms and Computation (ISAAC 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 149, pp. 36:1-36:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{kisfaludibak_et_al:LIPIcs.ISAAC.2019.36,
  author =	{Kisfaludi-Bak, S\'{a}ndor and Marx, D\'{a}niel and van der Zanden, Tom C.},
  title =	{{How Does Object Fatness Impact the Complexity of Packing in d Dimensions?}},
  booktitle =	{30th International Symposium on Algorithms and Computation (ISAAC 2019)},
  pages =	{36:1--36:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-130-6},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{149},
  editor =	{Lu, Pinyan and Zhang, Guochuan},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2019.36},
  URN =		{urn:nbn:de:0030-drops-115327},
  doi =		{10.4230/LIPIcs.ISAAC.2019.36},
  annote =	{Keywords: Geometric intersection graph, Independent Set, Object fatness}
}
Document
New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041)

Authors: Fedor V. Fomin, Dániel Marx, Saket Saurabh, and Meirav Zehavi

Published in: Dagstuhl Reports, Volume 9, Issue 1 (2019)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 19041 "New Horizons in Parameterized Complexity". Parameterized Complexity is celebrating its 30th birthday in 2019. In these three decades, there has been tremendous progress in developing the area. The central vision of Parameterized Complexity through all these years has been to provide the algorithmic and complexity-theoretic toolkit for studying multivariate algorithmics in different disciplines and subfields of Computer Science. These tools are universal as they did not only help in the development of the core of Parameterized Complexity, but also led to its success in other subfields of Computer Science such as Approximation Algorithms, Computational Social Choice, Computational Geometry, problems solvable in P (polynomial time), to name a few. In the last few years, we have witnessed several exciting developments of new parameterized techniques and tools in the following subfields of Computer Science and Optimization: Mathematical Programming, Computational Linear Algebra, Computational Counting, Derandomization, and Approximation Algorithms. The main objective of the seminar was to initiate the discussion on which of the recent domain-specific algorithms and complexity advances can become useful in other domains.

Cite as

Fedor V. Fomin, Dániel Marx, Saket Saurabh, and Meirav Zehavi. New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041). In Dagstuhl Reports, Volume 9, Issue 1, pp. 67-87, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@Article{fomin_et_al:DagRep.9.1.67,
  author =	{Fomin, Fedor V. and Marx, D\'{a}niel and Saurabh, Saket and Zehavi, Meirav},
  title =	{{New Horizons in Parameterized Complexity (Dagstuhl Seminar 19041)}},
  pages =	{67--87},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2019},
  volume =	{9},
  number =	{1},
  editor =	{Fomin, Fedor V. and Marx, D\'{a}niel and Saurabh, Saket and Zehavi, Meirav},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.9.1.67},
  URN =		{urn:nbn:de:0030-drops-105706},
  doi =		{10.4230/DagRep.9.1.67},
  annote =	{Keywords: Intractability, Parameterized Complexity}
}
Document
Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs

Authors: Vincent Cohen-Addad, Éric Colin de Verdière, Dániel Marx, and Arnaud de Mesmay

Published in: LIPIcs, Volume 129, 35th International Symposium on Computational Geometry (SoCG 2019)


Abstract
We prove essentially tight lower bounds, conditionally to the Exponential Time Hypothesis, for two fundamental but seemingly very different cutting problems on surface-embedded graphs: the Shortest Cut Graph problem and the Multiway Cut problem. A cut graph of a graph G embedded on a surface S is a subgraph of G whose removal from S leaves a disk. We consider the problem of deciding whether an unweighted graph embedded on a surface of genus g has a cut graph of length at most a given value. We prove a time lower bound for this problem of n^{Omega(g/log g)} conditionally to ETH. In other words, the first n^{O(g)}-time algorithm by Erickson and Har-Peled [SoCG 2002, Discr. Comput. Geom. 2004] is essentially optimal. We also prove that the problem is W[1]-hard when parameterized by the genus, answering a 17-year old question of these authors. A multiway cut of an undirected graph G with t distinguished vertices, called terminals, is a set of edges whose removal disconnects all pairs of terminals. We consider the problem of deciding whether an unweighted graph G has a multiway cut of weight at most a given value. We prove a time lower bound for this problem of n^{Omega(sqrt{gt + g^2}/log(gt))}, conditionally to ETH, for any choice of the genus g >=0 of the graph and the number of terminals t >=4. In other words, the algorithm by the second author [Algorithmica 2017] (for the more general multicut problem) is essentially optimal; this extends the lower bound by the third author [ICALP 2012] (for the planar case). Reductions to planar problems usually involve a grid-like structure. The main novel idea for our results is to understand what structures instead of grids are needed if we want to exploit optimally a certain value g of the genus.

Cite as

Vincent Cohen-Addad, Éric Colin de Verdière, Dániel Marx, and Arnaud de Mesmay. Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs. In 35th International Symposium on Computational Geometry (SoCG 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 129, pp. 27:1-27:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{cohenaddad_et_al:LIPIcs.SoCG.2019.27,
  author =	{Cohen-Addad, Vincent and Colin de Verdi\`{e}re, \'{E}ric and Marx, D\'{a}niel and de Mesmay, Arnaud},
  title =	{{Almost Tight Lower Bounds for Hard Cutting Problems in Embedded Graphs}},
  booktitle =	{35th International Symposium on Computational Geometry (SoCG 2019)},
  pages =	{27:1--27:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-104-7},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{129},
  editor =	{Barequet, Gill and Wang, Yusu},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2019.27},
  URN =		{urn:nbn:de:0030-drops-104311},
  doi =		{10.4230/LIPIcs.SoCG.2019.27},
  annote =	{Keywords: Cut graph, Multiway cut, Surface, Lower bound, Parameterized Complexity, Exponential Time Hypothesis}
}
Document
Multi-Budgeted Directed Cuts

Authors: Stefan Kratsch, Shaohua Li, Dániel Marx, Marcin Pilipczuk, and Magnus Wahlström

Published in: LIPIcs, Volume 115, 13th International Symposium on Parameterized and Exact Computation (IPEC 2018)


Abstract
In this paper, we study multi-budgeted variants of the classic minimum cut problem and graph separation problems that turned out to be important in parameterized complexity: Skew Multicut and Directed Feedback Arc Set. In our generalization, we assign colors 1,2,...,l to some edges and give separate budgets k_1,k_2,...,k_l for colors 1,2,...,l. For every color i in {1,...,l}, let E_i be the set of edges of color i. The solution C for the multi-budgeted variant of a graph separation problem not only needs to satisfy the usual separation requirements (i.e., be a cut, a skew multicut, or a directed feedback arc set, respectively), but also needs to satisfy that |C cap E_i| <= k_i for every i in {1,...,l}. Contrary to the classic minimum cut problem, the multi-budgeted variant turns out to be NP-hard even for l = 2. We propose FPT algorithms parameterized by k=k_1 +...+ k_l for all three problems. To this end, we develop a branching procedure for the multi-budgeted minimum cut problem that measures the progress of the algorithm not by reducing k as usual, but by elevating the capacity of some edges and thus increasing the size of maximum source-to-sink flow. Using the fact that a similar strategy is used to enumerate all important separators of a given size, we merge this process with the flow-guided branching and show an FPT bound on the number of (appropriately defined) important multi-budgeted separators. This allows us to extend our algorithm to the Skew Multicut and Directed Feedback Arc Set problems. Furthermore, we show connections of the multi-budgeted variants with weighted variants of the directed cut problems and the Chain l-SAT problem, whose parameterized complexity remains an open problem. We show that these problems admit a bounded-in-parameter number of "maximally pushed" solutions (in a similar spirit as important separators are maximally pushed), giving somewhat weak evidence towards their tractability.
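
A minimal verifier sketch for the multi-budgeted s-t cut variant, assuming a directed graph on vertices 0..n-1, an explicit edge-to-color map, and a budget for every color; it checks the budgets and that deleting C destroys all s-t paths:

from collections import deque

def is_multibudgeted_cut(n, edges, color, budgets, s, t, C):
    # per-color budget check: |C intersect E_i| <= k_i
    used = {}
    for e in C:
        used[color[e]] = used.get(color[e], 0) + 1
    if any(used.get(i, 0) > b for i, b in budgets.items()):
        return False
    # s must not reach t once the edges of C are removed (directed reachability)
    adj = {v: [] for v in range(n)}
    for e in edges:
        if e not in C:
            u, v = e
            adj[u].append(v)
    seen, queue = {s}, deque([s])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return t not in seen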

Cite as

Stefan Kratsch, Shaohua Li, Dániel Marx, Marcin Pilipczuk, and Magnus Wahlström. Multi-Budgeted Directed Cuts. In 13th International Symposium on Parameterized and Exact Computation (IPEC 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 115, pp. 18:1-18:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


Copy BibTex To Clipboard

@InProceedings{kratsch_et_al:LIPIcs.IPEC.2018.18,
  author =	{Kratsch, Stefan and Li, Shaohua and Marx, D\'{a}niel and Pilipczuk, Marcin and Wahlstr\"{o}m, Magnus},
  title =	{{Multi-Budgeted Directed Cuts}},
  booktitle =	{13th International Symposium on Parameterized and Exact Computation (IPEC 2018)},
  pages =	{18:1--18:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-084-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{115},
  editor =	{Paul, Christophe and Pilipczuk, Michal},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2018.18},
  URN =		{urn:nbn:de:0030-drops-102194},
  doi =		{10.4230/LIPIcs.IPEC.2018.18},
  annote =	{Keywords: important separators, multi-budgeted cuts, Directed Feedback Vertex Set, fixed-parameter tractability, minimum cut}
}
Document
Complete Volume
LIPIcs, Volume 107, ICALP'18, Complete Volume

Authors: Ioannis Chatzigiannakis, Christos Kaklamanis, Dániel Marx, and Donald Sannella

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
LIPIcs, Volume 107, ICALP'18, Complete Volume

Cite as

45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@Proceedings{chatzigiannakis_et_al:LIPIcs.ICALP.2018,
  title =	{{LIPIcs, Volume 107, ICALP'18, Complete Volume}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018},
  URN =		{urn:nbn:de:0030-drops-92803},
  doi =		{10.4230/LIPIcs.ICALP.2018},
  annote =	{Keywords: Theory of computation}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Ioannis Chatzigiannakis, Christos Kaklamanis, Dániel Marx, and Donald Sannella

Published in: LIPIcs, Volume 107, 45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

45th International Colloquium on Automata, Languages, and Programming (ICALP 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 107, pp. 0:i-0:xlviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{chatzigiannakis_et_al:LIPIcs.ICALP.2018.0,
  author =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{45th International Colloquium on Automata, Languages, and Programming (ICALP 2018)},
  pages =	{0:i--0:xlviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-076-7},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{107},
  editor =	{Chatzigiannakis, Ioannis and Kaklamanis, Christos and Marx, D\'{a}niel and Sannella, Donald},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2018.0},
  URN =		{urn:nbn:de:0030-drops-90049},
  doi =		{10.4230/LIPIcs.ICALP.2018.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
The Parameterized Hardness of the k-Center Problem in Transportation Networks

Authors: Andreas Emil Feldmann and Dániel Marx

Published in: LIPIcs, Volume 101, 16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018)


Abstract
In this paper we study the hardness of the k-Center problem on inputs that model transportation networks. For the problem, an edge-weighted graph G=(V,E) and an integer k are given and a center set C subseteq V needs to be chosen such that |C|<= k. The aim is to minimize the maximum distance of any vertex in the graph to the closest center. This problem arises in many applications of logistics, and thus it is natural to consider inputs that model transportation networks. Such inputs are often assumed to be planar graphs, low doubling metrics, or bounded highway dimension graphs. For each of these models, parameterized approximation algorithms have been shown to exist. We complement these results by proving that the k-Center problem is W[1]-hard on planar graphs of constant doubling dimension, where the parameter is the combination of the number of centers k, the highway dimension h, and even the treewidth t. Moreover, under the Exponential Time Hypothesis there is no f(k,t,h)* n^{o(t+sqrt{k+h})} time algorithm for any computable function f. Thus it is unlikely that the optimum solution to k-Center can be found efficiently, even when assuming that the input graph abides to all of the above models for transportation networks at once! Additionally we give a simple parameterized (1+{epsilon})-approximation algorithm for inputs of doubling dimension d with runtime (k^k/{epsilon}^{O(kd)})* n^{O(1)}. This generalizes a previous result, which considered inputs in D-dimensional L_q metrics.
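
A brute-force sketch of the k-Center objective, assuming all pairwise shortest-path distances have been precomputed (e.g., by Floyd-Warshall); the paper's point is that, under ETH, not much better than such exhaustive search is possible even on very structured inputs:

from itertools import combinations

def k_center(vertices, dist, k):
    """dist[u][v]: shortest-path distance between u and v (precomputed)."""
    best_cost, best_centers = float("inf"), None
    for C in combinations(vertices, k):
        cost = max(min(dist[v][c] for c in C) for v in vertices)
        if cost < best_cost:
            best_cost, best_centers = cost, set(C)
    return best_cost, best_centers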

Cite as

Andreas Emil Feldmann and Dániel Marx. The Parameterized Hardness of the k-Center Problem in Transportation Networks. In 16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 101, pp. 19:1-19:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{feldmann_et_al:LIPIcs.SWAT.2018.19,
  author =	{Feldmann, Andreas Emil and Marx, D\'{a}niel},
  title =	{{The Parameterized Hardness of the k-Center Problem in Transportation Networks}},
  booktitle =	{16th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2018)},
  pages =	{19:1--19:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-068-2},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{101},
  editor =	{Eppstein, David},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2018.19},
  URN =		{urn:nbn:de:0030-drops-88450},
  doi =		{10.4230/LIPIcs.SWAT.2018.19},
  annote =	{Keywords: k-center, parameterized complexity, planar graphs, doubling dimension, highway dimension, treewidth}
}
Document
Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms

Authors: Édouard Bonnet, Nick Brettell, O-joung Kwon, and Dániel Marx

Published in: LIPIcs, Volume 89, 12th International Symposium on Parameterized and Exact Computation (IPEC 2017)


Abstract
It has long been known that Feedback Vertex Set can be solved in time 2^O(w log w)n^O(1) on graphs of treewidth w, but it was only recently that this running time was improved to 2^O(w)n^O(1), that is, to single-exponential parameterized by treewidth. We investigate which generalizations of Feedback Vertex Set can be solved in a similar running time. Formally, for a class of graphs P, Bounded P-Block Vertex Deletion asks, given a graph G on n vertices and positive integers k and d, whether G contains a set S of at most k vertices such that each block of G-S has at most d vertices and is in P. Assuming that P is recognizable in polynomial time and satisfies a certain natural hereditary condition, we give a sharp characterization of when single-exponential parameterized algorithms are possible for fixed values of d: - if P consists only of chordal graphs, then the problem can be solved in time 2^O(wd^2) n^{O}(1), - if P contains a graph with an induced cycle of length ell>= 4, then the problem is not solvable in time 2^{o(w log w)} n^O(1)} even for fixed d=ell, unless the ETH fails. We also study a similar problem, called Bounded P-Component Vertex Deletion, where the target graphs have connected components of small size instead of having blocks of small size, and present analogous results.

Cite as

Édouard Bonnet, Nick Brettell, O-joung Kwon, and Dániel Marx. Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms. In 12th International Symposium on Parameterized and Exact Computation (IPEC 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 89, pp. 7:1-7:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{bonnet_et_al:LIPIcs.IPEC.2017.7,
  author =	{Bonnet, \'{E}douard and Brettell, Nick and Kwon, O-joung and Marx, D\'{a}niel},
  title =	{{Generalized Feedback Vertex Set Problems on Bounded-Treewidth Graphs: Chordality Is the Key to Single-Exponential Parameterized Algorithms}},
  booktitle =	{12th International Symposium on Parameterized and Exact Computation (IPEC 2017)},
  pages =	{7:1--7:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-051-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{89},
  editor =	{Lokshtanov, Daniel and Nishimura, Naomi},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2017.7},
  URN =		{urn:nbn:de:0030-drops-85653},
  doi =		{10.4230/LIPIcs.IPEC.2017.7},
  annote =	{Keywords: fixed-parameter tractable algorithms, treewidth, feedback vertex set}
}
Document
Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization

Authors: László Egri, Dániel Marx, and Pawel Rzazewski

Published in: LIPIcs, Volume 96, 35th Symposium on Theoretical Aspects of Computer Science (STACS 2018)


Abstract
In the list homomorphism problem, the input consists of two graphs G and H, together with a list L(v) \subseteq V(H) for every vertex v \in V(G). The task is to find a homomorphism phi:V(G) -> V(H) respecting the lists, that is, we have that phi(v) \in L(v) for every v \in V(G) and if u and v are adjacent in G, then phi(u) and phi(v) are adjacent in H. If H is a fixed graph, then the problem is denoted LHom(H). We consider the reflexive version of the problem, where we assume that every vertex in H has a self-loop. It is known that reflexive LHom(H) is polynomial-time solvable if H is an interval graph and it is NP-complete otherwise [Feder and Hell, JCTB 1998]. We explore the complexity of the problem parameterized by the treewidth tw(G) of the input graph G. If a tree decomposition of G of width tw(G) is given in the input, then the problem can be solved in time |V(H)|^{tw(G)} n^{O(1)} by naive dynamic programming. Our main result completely reveals when and by exactly how much this naive algorithm can be improved. We introduce a simple combinatorial invariant i^*(H), which is based on the existence of decompositions and incomparable sets, and show that this number should appear as the base of the exponent in the best possible running time. Specifically, we prove for every fixed non-interval graph H that * If a tree decomposition of width tw(G) is given in the input, then the problem can be solved in time i^*(H)^{tw(G)} n^{O(1)}. * Assuming the Strong Exponential-Time Hypothesis (SETH), the problem cannot be solved in time (i^*(H)-epsilon)^{tw(G)} n^{O(1)} for any epsilon>0. Thus by matching upper and lower bounds, our result exactly characterizes for every fixed H the complexity of reflexive LHom(H) parameterized by treewidth.
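
A brute-force sketch of the list homomorphism definition, assuming G and H are given as vertex/edge lists and that the self-loops of the reflexive graph H are included among its edges; the paper's algorithm instead runs dynamic programming over a tree decomposition:

from itertools import product

def list_homomorphism(G_vertices, G_edges, H_edges, lists):
    """lists[v]: allowed images in V(H) for vertex v of G. Since H is reflexive,
    its self-loops are assumed to appear in H_edges."""
    H_adj = {frozenset(e) for e in H_edges}
    order = list(G_vertices)
    for images in product(*(lists[v] for v in order)):
        phi = dict(zip(order, images))
        if all(frozenset((phi[u], phi[v])) in H_adj for u, v in G_edges):
            return phi
    return None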

Cite as

László Egri, Dániel Marx, and Pawel Rzazewski. Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization. In 35th Symposium on Theoretical Aspects of Computer Science (STACS 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 96, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


Copy BibTex To Clipboard

@InProceedings{egri_et_al:LIPIcs.STACS.2018.27,
  author =	{Egri, L\'{a}szl\'{o} and Marx, D\'{a}niel and Rzazewski, Pawel},
  title =	{{Finding List Homomorphisms from Bounded-treewidth Graphs to Reflexive Graphs: a Complete Complexity Characterization}},
  booktitle =	{35th Symposium on Theoretical Aspects of Computer Science (STACS 2018)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-062-0},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{96},
  editor =	{Niedermeier, Rolf and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2018.27},
  URN =		{urn:nbn:de:0030-drops-84867},
  doi =		{10.4230/LIPIcs.STACS.2018.27},
  annote =	{Keywords: graph homomorphism, list homomorphism, reflexive graph, treewidth}
}
Document
Subexponential Parameterized Algorithms for Graphs of Polynomial Growth

Authors: Dániel Marx and Marcin Pilipczuk

Published in: LIPIcs, Volume 87, 25th Annual European Symposium on Algorithms (ESA 2017)


Abstract
We show that for a number of parameterized problems for which only 2^{O(k)} n^{O(1)} time algorithms are known on general graphs, subexponential parameterized algorithms with running time 2^{O(k^{1-1/(1+d)} log^2 k)} n^{O(1)} are possible for graphs of polynomial growth with growth rate (degree) d, that is, if we assume that every ball of radius r contains only O(r^d) vertices. The algorithms use the technique of low-treewidth pattern covering, introduced by Fomin et al. [FOCS 2016] for planar graphs; here we show how this strategy can be made to work for graphs of polynomial growth. Formally, we prove that, given a graph G of polynomial growth with growth rate d and an integer k, one can in randomized polynomial time find a subset A of V(G) such that on one hand the treewidth of G[A] is O(k^{1-1/(1+d)} log k), and on the other hand for every set X of vertices of size at most k, the probability that X is a subset of A is 2^{-O(k^{1-1/(1+d)} log^2 k)}. Together with standard dynamic programming techniques on graphs of bounded treewidth, this statement gives subexponential parameterized algorithms for a number of subgraph search problems, such as Long Path or Steiner Tree, in graphs of polynomial growth. We complement the algorithm with an almost tight lower bound for Long Path: unless the Exponential Time Hypothesis fails, no parameterized algorithm with running time 2^{k^{1-1/d-epsilon}}n^{O(1)} is possible for any positive epsilon and any integer d >= 3.

Cite as

Dániel Marx and Marcin Pilipczuk. Subexponential Parameterized Algorithms for Graphs of Polynomial Growth. In 25th Annual European Symposium on Algorithms (ESA 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 87, pp. 59:1-59:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ESA.2017.59,
  author =	{Marx, D\'{a}niel and Pilipczuk, Marcin},
  title =	{{Subexponential Parameterized Algorithms for Graphs of Polynomial Growth}},
  booktitle =	{25th Annual European Symposium on Algorithms (ESA 2017)},
  pages =	{59:1--59:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-049-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{87},
  editor =	{Pruhs, Kirk and Sohler, Christian},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2017.59},
  URN =		{urn:nbn:de:0030-drops-78162},
  doi =		{10.4230/LIPIcs.ESA.2017.59},
  annote =	{Keywords: polynomial growth, subexponential algorithm, low treewidth pattern covering}
}
Document
Fine-Grained Complexity of Coloring Unit Disks and Balls

Authors: Csaba Biró, Édouard Bonnet, Dániel Marx, Tillmann Miltzow, and Pawel Rzazewski

Published in: LIPIcs, Volume 77, 33rd International Symposium on Computational Geometry (SoCG 2017)


Abstract
On planar graphs, many classic algorithmic problems enjoy a certain "square root phenomenon" and can be solved significantly faster than what is known to be possible on general graphs: for example, Independent Set, 3-Coloring, Hamiltonian Cycle, Dominating Set can be solved in time 2^O(sqrt{n}) on an n-vertex planar graph, while no 2^o(n) algorithms exist for general graphs, assuming the Exponential Time Hypothesis (ETH). The square root in the exponent seems to be best possible for planar graphs: assuming the ETH, the running time for these problems cannot be improved to 2^o(sqrt{n}). In some cases, a similar speedup can be obtained for 2-dimensional geometric problems, for example, there are 2^O(sqrt{n}log n) time algorithms for Independent Set on unit disk graphs or for TSP on 2-dimensional point sets. In this paper, we explore whether such a speedup is possible for geometric coloring problems. On the one hand, geometric objects can behave similarly to planar graphs: 3-Coloring can be solved in time 2^O(sqrt{n}) on the intersection graph of n unit disks in the plane and, assuming the ETH, there is no such algorithm with running time 2^o(sqrt{n}). On the other hand, if the number L of colors is part of the input, then no such speedup is possible: Coloring the intersection graph of n unit disks with L colors cannot be solved in time 2^o(n), assuming the ETH. More precisely, we exhibit a smooth increase of complexity as the number L of colors increases: If we restrict the number of colors to L=Theta(n^alpha) for some 0<=alpha<=1, then the problem of coloring the intersection graph of n unit disks with L colors * can be solved in time exp(O(n^{{1+alpha}/2}log n))=exp( O(sqrt{nL}log n)), and * cannot be solved in time exp(o(n^{{1+alpha}/2}))=exp(o(sqrt{nL})), unless the ETH fails. More generally, we consider the problem of coloring d-dimensional unit balls in the Euclidean space and obtain analogous results showing that the problem * can be solved in time exp(O(n^{{d-1+alpha}/d}log n))=exp(O(n^{1-1/d}L^{1/d}log n)), and * cannot be solved in time exp(n^{{d-1+alpha}/d-epsilon})= exp (O(n^{1-1/d-epsilon}L^{1/d})) for any epsilon>0, unless the ETH fails.

Cite as

Csaba Biró, Édouard Bonnet, Dániel Marx, Tillmann Miltzow, and Pawel Rzazewski. Fine-Grained Complexity of Coloring Unit Disks and Balls. In 33rd International Symposium on Computational Geometry (SoCG 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 77, pp. 18:1-18:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{biro_et_al:LIPIcs.SoCG.2017.18,
  author =	{Bir\'{o}, Csaba and Bonnet, \'{E}douard and Marx, D\'{a}niel and Miltzow, Tillmann and Rzazewski, Pawel},
  title =	{{Fine-Grained Complexity of Coloring Unit Disks and Balls}},
  booktitle =	{33rd International Symposium on Computational Geometry (SoCG 2017)},
  pages =	{18:1--18:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-038-5},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{77},
  editor =	{Aronov, Boris and Katz, Matthew J.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2017.18},
  URN =		{urn:nbn:de:0030-drops-71800},
  doi =		{10.4230/LIPIcs.SoCG.2017.18},
  annote =	{Keywords: unit disk graphs, unit ball graphs, coloring, exact algorithm}
}
Document
Invited Talk
Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries (Invited Talk)

Authors: Dániel Marx

Published in: LIPIcs, Volume 68, 20th International Conference on Database Theory (ICDT 2017)


Abstract
The complexity of evaluating conjunctive queries can depend significantly on the structure of the query. For example, it is well known that various notions of acyclicity can make the evaluation problem tractable. More generally, it seems that the complexity is connected to the "treelikeness" of the graph or hypergraph describing the query structure. In the lecture, we will review some of the notions of treelikeness that were proposed in the literature and how they are relevant for the complexity of evaluating conjunctive queries and related problems.

Cite as

Dániel Marx. Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries (Invited Talk). In 20th International Conference on Database Theory (ICDT 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 68, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.ICDT.2017.2,
  author =	{Marx, D\'{a}niel},
  title =	{{Graphs, Hypergraphs, and the Complexity of Conjunctive Database Queries}},
  booktitle =	{20th International Conference on Database Theory (ICDT 2017)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-024-8},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{68},
  editor =	{Benedikt, Michael and Orsi, Giorgio},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICDT.2017.2},
  URN =		{urn:nbn:de:0030-drops-70652},
  doi =		{10.4230/LIPIcs.ICDT.2017.2},
  annote =	{Keywords: Conjunctive queries, treewidth, complexity}
}
Document
Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix

Authors: Lin Chen, Dániel Marx, Deshi Ye, and Guochuan Zhang

Published in: LIPIcs, Volume 66, 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)


Abstract
We study approximation and parameterized algorithms for R||C_max, focusing on the problem when the rank of the matrix formed by job processing times is small. Bhaskara et al. initiated the study of approximation algorithms with respect to the rank, showing that R||C_max admits a QPTAS (Quasi-polynomial time approximation scheme) when the rank is 2, and becomes APX-hard when the rank is 4. We continue this line of research. We prove that R||C_max is APX-hard even if the rank is 3, resolving an open problem. We then show that R||C_max is FPT parameterized by the rank and the largest job processing time p_max. This generalizes the parameterized results on P||C_max and R||C_max with few different types of machines. We also provide nearly tight lower bounds under the Exponential Time Hypothesis which suggest that the running time of the FPT algorithm is unlikely to be improved significantly.
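
As a toy illustration of the setting only (not the approximation schemes or the FPT algorithm from the paper), the sketch below builds a rank-2 processing-time matrix p[i][j] = a_i*b_j + c_i*d_j and solves R||C_max exactly by brute force over all job-to-machine assignments; the instance data and function names are invented for this example.

from itertools import product

def rank2_matrix(a, b, c, d):
    """Processing times p[i][j] = a[i]*b[j] + c[i]*d[j] for machine i and job j (matrix rank <= 2)."""
    return [[a[i] * b[j] + c[i] * d[j] for j in range(len(b))] for i in range(len(a))]

def brute_force_makespan(p):
    """Optimal makespan for R||C_max by trying every one of the m^n job-to-machine assignments."""
    m, n = len(p), len(p[0])
    best = float("inf")
    for assignment in product(range(m), repeat=n):  # assignment[j] = machine of job j
        loads = [0] * m
        for j, i in enumerate(assignment):
            loads[i] += p[i][j]
        best = min(best, max(loads))
    return best

if __name__ == "__main__":
    # Toy instance: 2 machines (vectors a, c) and 4 jobs (vectors b, d).
    p = rank2_matrix(a=[1, 2], c=[3, 1], b=[2, 1, 1, 3], d=[1, 2, 1, 1])
    print(brute_force_makespan(p))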

Cite as

Lin Chen, Dániel Marx, Deshi Ye, and Guochuan Zhang. Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix. In 34th Symposium on Theoretical Aspects of Computer Science (STACS 2017). Leibniz International Proceedings in Informatics (LIPIcs), Volume 66, pp. 22:1-22:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{chen_et_al:LIPIcs.STACS.2017.22,
  author =	{Chen, Lin and Marx, D\'{a}niel and Ye, Deshi and Zhang, Guochuan},
  title =	{{Parameterized and Approximation Results for Scheduling with a Low Rank Processing Time Matrix}},
  booktitle =	{34th Symposium on Theoretical Aspects of Computer Science (STACS 2017)},
  pages =	{22:1--22:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-028-6},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{66},
  editor =	{Vollmer, Heribert and Vall\'{e}e, Brigitte},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2017.22},
  URN =		{urn:nbn:de:0030-drops-70110},
  doi =		{10.4230/LIPIcs.STACS.2017.22},
  annote =	{Keywords: APX-hardness, Parameterized algorithm, Scheduling, Exponential Time Hypothesis}
}
Document
H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms

Authors: Gábor Bacsó, Dániel Marx, and Zsolt Tuza

Published in: LIPIcs, Volume 63, 11th International Symposium on Parameterized and Exact Computation (IPEC 2016)


Abstract
It is an outstanding open question in algorithmic graph theory to determine the complexity of the MAXIMUM INDEPENDENT SET problem on P_t-free graphs, that is, on graphs not containing any induced path on t vertices. So far, polynomial-time algorithms are known only for t at most 5 [Lokshtanov et al., SODA 2014, 570-581, 2014]. Here we study the existence of subexponential-time algorithms for the problem: by generalizing an earlier result of Randerath and Schiermeyer for t=5 [Discrete App. Math., 158 (2010), 1041-1044], we show that for any t at least 5, there is an algorithm for MAXIMUM INDEPENDENT SET on P_t-free graphs whose running time is subexponential in the number of vertices. SCATTERED SET is the generalization of MAXIMUM INDEPENDENT SET where the vertices of the solution are required to be at distance at least d from each other. We give a complete characterization of those graphs H for which SCATTERED SET on H-free graphs can be solved in time subexponential in the size of the input (that is, in the number of vertices plus the number of edges):
* If every component of H is a path, then d-SCATTERED SET on H-free graphs with n vertices and m edges can be solved in time 2^{(n+m)^{1-O(1/|V(H)|)}}, even if d is part of the input.
* Otherwise, assuming ETH, there is no 2^{o(n+m)} time algorithm for d-SCATTERED SET for any fixed d at least 3 on H-free graphs with n vertices and m edges.
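
To make the d-SCATTERED SET objective concrete, here is a minimal brute-force sketch in Python (not the subexponential algorithm of the paper); the toy graph and the names are invented, and d = 2 recovers MAXIMUM INDEPENDENT SET.

from collections import deque
from itertools import combinations

def bfs_distances(adj, src):
    """Unweighted shortest-path distances from src in a graph given as an adjacency dict."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def max_scattered_set(adj, d):
    """Largest vertex set with pairwise distance >= d, found by brute force over subsets."""
    vertices = list(adj)
    dist = {u: bfs_distances(adj, u) for u in vertices}
    for size in range(len(vertices), 0, -1):
        for cand in combinations(vertices, size):
            if all(dist[u].get(v, float("inf")) >= d for u, v in combinations(cand, 2)):
                return set(cand)
    return set()

if __name__ == "__main__":
    # A path on 6 vertices 0-1-2-3-4-5; a maximum 3-scattered set has size 2, e.g. {0, 3}.
    path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
    print(max_scattered_set(path, d=3))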

Cite as

Gábor Bacsó, Dániel Marx, and Zsolt Tuza. H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms. In 11th International Symposium on Parameterized and Exact Computation (IPEC 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 63, pp. 3:1-3:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2017)


Copy BibTex To Clipboard

@InProceedings{bacso_et_al:LIPIcs.IPEC.2016.3,
  author =	{Bacs\'{o}, G\'{a}bor and Marx, D\'{a}niel and Tuza, Zsolt},
  title =	{{H-Free Graphs, Independent Sets, and Subexponential-Time Algorithms}},
  booktitle =	{11th International Symposium on Parameterized and Exact Computation (IPEC 2016)},
  pages =	{3:1--3:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-023-1},
  ISSN =	{1868-8969},
  year =	{2017},
  volume =	{63},
  editor =	{Guo, Jiong and Hermelin, Danny},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.IPEC.2016.3},
  URN =		{urn:nbn:de:0030-drops-69397},
  doi =		{10.4230/LIPIcs.IPEC.2016.3},
  annote =	{Keywords: independent set, scattered set, subexponential algorithms, H-free graphs}
}
Document
Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221)

Authors: Jeff Erickson, Philip N. Klein, Dániel Marx, and Claire Mathieu

Published in: Dagstuhl Reports, Volume 6, Issue 5 (2016)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 16221 “Algorithms for Optimization Problems in Planar Graphs”. The seminar was held from May 29 to June 3, 2016. This report contains abstracts for the recent developments in planar graph algorithms discussed during the seminar as well as summaries of open problems in this area of research.

Cite as

Jeff Erickson, Philip N. Klein, Dániel Marx, and Claire Mathieu. Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221). In Dagstuhl Reports, Volume 6, Issue 5, pp. 94-113, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@Article{erickson_et_al:DagRep.6.5.94,
  author =	{Erickson, Jeff and Klein, Philip N. and Marx, D\'{a}niel and Mathieu, Claire},
  title =	{{Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 16221)}},
  pages =	{94--113},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2016},
  volume =	{6},
  number =	{5},
  editor =	{Erickson, Jeff and Klein, Philip N. and Marx, D\'{a}niel and Mathieu, Claire},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.6.5.94},
  URN =		{urn:nbn:de:0030-drops-67227},
  doi =		{10.4230/DagRep.6.5.94},
  annote =	{Keywords: Algorithms, planar graphs, theory, approximation, fixed-parameter tractable, network flow, network design, kernelization}
}
Document
Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs

Authors: Dániel Marx, Ario Salmasi, and Anastasios Sidiropoulos

Published in: LIPIcs, Volume 60, Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)


Abstract
In the Asymmetric Traveling Salesperson Problem (ATSP) the goal is to find a closed walk of minimum cost in a directed graph visiting every vertex. We consider the approximability of ATSP on topologically restricted graphs. It has been shown by Oveis Gharan and Saberi [SODA, 2011] that there exist polynomial-time constant-factor approximations on planar graphs and, more generally, on graphs of constant orientable genus. This result was extended to non-orientable genus by Erickson and Sidiropoulos [SoCG, 2014]. We show that for any class of nearly-embeddable graphs, ATSP admits a polynomial-time constant-factor approximation. More precisely, we show that for any fixed non-negative k, there exist positive alpha and beta, such that ATSP on n-vertex k-nearly-embeddable graphs admits an alpha-approximation in time O(n^beta). The class of k-nearly-embeddable graphs contains graphs with at most k apices, k vortices of width at most k, and an underlying surface of either orientable or non-orientable genus at most k. Prior to our work, even the case of graphs with a single apex was open. Our algorithm combines tools from rounding the Held-Karp LP via thin trees with dynamic programming. We complement our upper bounds by showing that solving ATSP exactly on graphs of pathwidth k (and hence on k-nearly embeddable graphs) requires time n^{Omega(k)}, assuming the Exponential-Time Hypothesis (ETH). This is surprising in light of the fact that both TSP on undirected graphs and Minimum Cost Hamiltonian Cycle on directed graphs are FPT parameterized by treewidth.

Cite as

Dániel Marx, Ario Salmasi, and Anastasios Sidiropoulos. Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs. In Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 60, pp. 16:1-16:54, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.APPROX-RANDOM.2016.16,
  author =	{Marx, D\'{a}niel and Salmasi, Ario and Sidiropoulos, Anastasios},
  title =	{{Constant-Factor Approximations for Asymmetric TSP on Nearly-Embeddable Graphs}},
  booktitle =	{Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2016)},
  pages =	{16:1--16:54},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-018-7},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{60},
  editor =	{Jansen, Klaus and Mathieu, Claire and Rolim, Jos\'{e} D. P. and Umans, Chris},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.APPROX-RANDOM.2016.16},
  URN =		{urn:nbn:de:0030-drops-66391},
  doi =		{10.4230/LIPIcs.APPROX-RANDOM.2016.16},
  annote =	{Keywords: asymmetric TSP, approximation algorithms, nearly-embeddable graphs, Held-Karp LP, exponential time hypothesis}
}
Document
The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems

Authors: Andreas Emil Feldmann and Dániel Marx

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
Given a directed graph G and a list (s_1, t_1), ..., (s_k, t_k) of terminal pairs, the Directed Steiner Network problem asks for a minimum-cost subgraph of G that contains a directed s_i -> t_i path for every 1 <= i <= k. The special case Directed Steiner Tree (when we ask for paths from a root r to terminals t_1, ..., t_k) is known to be fixed-parameter tractable parameterized by the number of terminals, while the special case Strongly Connected Steiner Subgraph (when we ask for a path from every t_i to every other t_j) is known to be W[1]-hard parameterized by the number of terminals. We systematically explore the complexity landscape of directed Steiner problems to fully understand which other special cases are FPT or W[1]-hard. Formally, if H is a class of directed graphs, then we look at the special case of Directed Steiner Network where the list (s_1, t_1), ..., (s_k, t_k) of requests forms a directed graph that is a member of H. Our main result is a complete characterization of the classes H resulting in fixed-parameter tractable special cases: we show that if every pattern in H has the combinatorial property of being "transitively equivalent to a bounded-length caterpillar with a bounded number of extra edges," then the problem is FPT, and it is W[1]-hard for every recursively enumerable H not having this property. This complete dichotomy unifies and generalizes the known results showing that Directed Steiner Tree is FPT [Dreyfus and Wagner, Networks 1971], Strongly Connected Steiner Subgraph is W[1]-hard [Guo et al., SIAM J. Discrete Math. 2011], and Directed Steiner Network is solvable in polynomial time for a constant number of terminals [Feldman and Ruhl, SIAM J. Comput. 2006], and moreover reveals a large continent of tractable cases that were not known before.
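
The sketch below only illustrates the Directed Steiner Network objective itself, by exhaustive search over edge subsets of a tiny weighted digraph; it is neither the Feldman-Ruhl algorithm nor the FPT classification of the paper, and the instance is invented.

from collections import defaultdict, deque
from itertools import combinations

def reachable(edges, src):
    """Set of vertices reachable from src using only the given directed edges."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def min_cost_steiner_network(cost, requests):
    """Cheapest edge subset of the weighted digraph `cost` containing an s->t path for every request."""
    edges = list(cost)
    best_cost, best_subset = float("inf"), None
    for size in range(len(edges) + 1):
        for subset in combinations(edges, size):
            if all(t in reachable(subset, s) for s, t in requests):
                total = sum(cost[e] for e in subset)
                if total < best_cost:
                    best_cost, best_subset = total, subset
    return best_cost, best_subset

if __name__ == "__main__":
    # Toy digraph (edge -> cost); the cheapest solution routes a->b->c and c->a for total cost 4.
    cost = {("a", "b"): 1, ("b", "c"): 1, ("a", "c"): 3, ("c", "a"): 2}
    print(min_cost_steiner_network(cost, requests=[("a", "c"), ("c", "a")]))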

Cite as

Andreas Emil Feldmann and Dániel Marx. The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 27:1-27:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{feldmann_et_al:LIPIcs.ICALP.2016.27,
  author =	{Feldmann, Andreas Emil and Marx, D\'{a}niel},
  title =	{{The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{27:1--27:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.27},
  URN =		{urn:nbn:de:0030-drops-63060},
  doi =		{10.4230/LIPIcs.ICALP.2016.27},
  annote =	{Keywords: Directed Steiner Tree, Directed Steiner Network, fixed-parameter tractability, dichotomy}
}
Document
Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth

Authors: Dániel Marx and Valia Mitsou

Published in: LIPIcs, Volume 55, 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)


Abstract
Choosability, introduced by Erdös, Rubin, and Taylor [Congr. Number. 1979], is a well-studied concept in graph theory: we say that a graph is c-choosable if for any assignment of a list of c colors to each vertex, there is a proper coloring where each vertex uses a color from its list. We study the complexity of deciding choosability on graphs of bounded treewidth. It follows from earlier work that 3-choosability can be decided in time 2^(2^(O(w)))*n^(O(1)) on graphs of treewidth w. We complement this result by a matching lower bound giving evidence that double-exponential dependence on treewidth may be necessary for the problem: we show that an algorithm with running time 2^(2^(o(w)))*n^(O(1)) would violate the Exponential-Time Hypothesis (ETH). We also consider the optimization problem where the task is to delete the minimum number of vertices to make the graph 4-choosable, and demonstrate that the dependence on treewidth becomes triple-exponential for this problem: it can be solved in time 2^(2^(2^(O(w))))*n^(O(1)) on graphs of treewidth w, but an algorithm with running time 2^(2^(2^(o(w))))*n^(O(1)) would violate ETH.
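
A brute-force rendering of the definition of c-choosability (purely illustrative, feasible only for very small graphs, and unrelated to the treewidth-based algorithms discussed above); it uses the standard renaming argument that the lists may be assumed to come from a palette of c*|V| colors.

from itertools import combinations, product

def has_list_coloring(adj, lists):
    """Is there a proper coloring of the (symmetric) adjacency dict adj using a color from each list?"""
    vertices = list(adj)
    for colors in product(*(lists[v] for v in vertices)):
        coloring = dict(zip(vertices, colors))
        if all(coloring[u] != coloring[v] for u in adj for v in adj[u]):
            return True
    return False

def is_choosable(adj, c):
    """Brute-force c-choosability test; by renaming colors, lists from a palette of c*|V| colors suffice."""
    vertices = list(adj)
    palette = range(c * len(vertices))
    possible_lists = list(combinations(palette, c))
    for assignment in product(possible_lists, repeat=len(vertices)):
        lists = dict(zip(vertices, assignment))
        if not has_list_coloring(adj, lists):
            return False        # found a list assignment with no proper list coloring
    return True

if __name__ == "__main__":
    edge = {0: [1], 1: [0]}
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    print(is_choosable(edge, 2))      # True: any two 2-lists admit a proper coloring
    print(is_choosable(triangle, 2))  # False: giving all three vertices the same 2-list fails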

Cite as

Dániel Marx and Valia Mitsou. Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth. In 43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 55, pp. 28:1-28:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.ICALP.2016.28,
  author =	{Marx, D\'{a}niel and Mitsou, Valia},
  title =	{{Double-Exponential and Triple-Exponential Bounds for Choosability Problems Parameterized by Treewidth}},
  booktitle =	{43rd International Colloquium on Automata, Languages, and Programming (ICALP 2016)},
  pages =	{28:1--28:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-013-2},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{55},
  editor =	{Chatzigiannakis, Ioannis and Mitzenmacher, Michael and Rabani, Yuval and Sangiorgi, Davide},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2016.28},
  URN =		{urn:nbn:de:0030-drops-63078},
  doi =		{10.4230/LIPIcs.ICALP.2016.28},
  annote =	{Keywords: Parameterized Complexity, List coloring, Treewidth, Lower bounds under ETH}
}
Document
Routing with Congestion in Acyclic Digraphs

Authors: Saeed Akhoondian Amiri, Stephan Kreutzer, Dániel Marx, and Roman Rabinovich

Published in: LIPIcs, Volume 58, 41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016)


Abstract
We study the version of the k-disjoint paths problem where k demand pairs (s_1,t_1), ..., (s_k,t_k) are specified in the input and the paths in the solution are allowed to intersect, but such that no vertex is on more than c paths. We show that on directed acyclic graphs the problem is solvable in time n^{O(d)} if we allow congestion k-d for k paths. Furthermore, we show that, under a suitable complexity theoretic assumption, the problem cannot be solved in time f(k)n^{o(d*log(d))} for any computable function f.

Cite as

Saeed Akhoondian Amiri, Stephan Kreutzer, Dániel Marx, and Roman Rabinovich. Routing with Congestion in Acyclic Digraphs. In 41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 58, pp. 7:1-7:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{amiri_et_al:LIPIcs.MFCS.2016.7,
  author =	{Amiri, Saeed Akhoondian and Kreutzer, Stephan and Marx, D\'{a}niel and Rabinovich, Roman},
  title =	{{Routing with Congestion in Acyclic Digraphs}},
  booktitle =	{41st International Symposium on Mathematical Foundations of Computer Science (MFCS 2016)},
  pages =	{7:1--7:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-016-3},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{58},
  editor =	{Faliszewski, Piotr and Muscholl, Anca and Niedermeier, Rolf},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.MFCS.2016.7},
  URN =		{urn:nbn:de:0030-drops-64244},
  doi =		{10.4230/LIPIcs.MFCS.2016.7},
  annote =	{Keywords: algorithms, disjoint paths, congestion, acyclic digraphs, XP, W\lbrack1\rbrack-hard}
}
Document
Fixed-Parameter Approximability of Boolean MinCSPs

Authors: Édouard Bonnet, László Egri, and Dániel Marx

Published in: LIPIcs, Volume 57, 24th Annual European Symposium on Algorithms (ESA 2016)


Abstract
The minimum unsatisfiability version of a constraint satisfaction problem (CSP) asks for an assignment where the number of unsatisfied constraints is the minimum possible, or equivalently, asks for a minimum-size set of constraints whose deletion makes the instance satisfiable. For a finite set Gamma of constraints, we denote by CSP(Gamma) the restriction of the problem where each constraint is from Gamma. The polynomial-time solvability and the polynomial-time approximability of CSP(Gamma) were fully characterized by [Khanna et al. SICOMP 2000]. Here we study the fixed-parameter (FP-) approximability of the problem: given an instance and an integer k, one has to find a solution of size at most g(k) in time f(k)n^{O(1)} if a solution of size at most k exists. We especially focus on the case of constant-factor FP-approximability. Our main result classifies each finite constraint language Gamma into one of three classes: (1) CSP(Gamma) has a constant-factor FP-approximation; (2) CSP(Gamma) has a (constant-factor) FP-approximation if and only if Nearest Codeword has a (constant-factor) FP-approximation; (3) CSP(Gamma) has no FP-approximation, unless FPT=W[P]. We show that problems in the second class do not have constant-factor FP-approximations if both the Exponential-Time Hypothesis (ETH) and the Linear PCP Conjecture (LPC) hold. We also show that such an approximation would imply the existence of an FP-approximation for the k-Densest Subgraph problem with ratio 1-epsilon for any epsilon>0.
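
A minimal sketch of the minimum-unsatisfiability objective in the Boolean case, by brute force over deletion sets and assignments; the constraint encoding and the toy instance are assumptions of this example, and this is not the FP-approximation machinery of the paper.

from itertools import combinations, product

def satisfiable(variables, constraints):
    """Brute-force Boolean satisfiability; each constraint is a pair (scope, set of allowed tuples)."""
    for bits in product([0, 1], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if all(tuple(assignment[x] for x in scope) in allowed for scope, allowed in constraints):
            return True
    return False

def min_deletion(variables, constraints):
    """Fewest constraints to delete so that the remaining instance is satisfiable (Boolean MinCSP)."""
    for k in range(len(constraints) + 1):
        for removed in combinations(range(len(constraints)), k):
            kept = [c for i, c in enumerate(constraints) if i not in removed]
            if satisfiable(variables, kept):
                return k, removed
    return len(constraints), tuple(range(len(constraints)))

if __name__ == "__main__":
    # Three mutually inconsistent constraints over x, y: deleting any single one is enough.
    constraints = [
        (("x", "y"), {(0, 0), (1, 1)}),   # x = y
        (("x", "y"), {(0, 1), (1, 0)}),   # x != y
        (("x",),     {(1,)}),             # x = 1
    ]
    print(min_deletion(["x", "y"], constraints))   # (1, (0,))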

Cite as

Édouard Bonnet, László Egri, and Dániel Marx. Fixed-Parameter Approximability of Boolean MinCSPs. In 24th Annual European Symposium on Algorithms (ESA 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 57, pp. 18:1-18:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{bonnet_et_al:LIPIcs.ESA.2016.18,
  author =	{Bonnet, \'{E}douard and Egri, L\'{a}szl\'{o} and Marx, D\'{a}niel},
  title =	{{Fixed-Parameter Approximability of Boolean MinCSPs}},
  booktitle =	{24th Annual European Symposium on Algorithms (ESA 2016)},
  pages =	{18:1--18:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-015-6},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{57},
  editor =	{Sankowski, Piotr and Zaroliagis, Christos},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ESA.2016.18},
  URN =		{urn:nbn:de:0030-drops-63694},
  doi =		{10.4230/LIPIcs.ESA.2016.18},
  annote =	{Keywords: constraint satisfaction problems, approximability, fixed-parameter tractability}
}
Document
Invited Talk
The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems (Invited Talk)

Authors: Dániel Marx

Published in: LIPIcs, Volume 53, 15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016)


Abstract
Given a directed graph G and a list (s_1,t_1), ..., (s_k,t_k) of terminal pairs, the Directed Steiner Network problem asks for a minimum-cost subgraph of G that contains a directed s_i-> t_i path for every 1<= i <= k. Feldman and Ruhl presented an n^{O(k)} time algorithm for the problem, which shows that it is polynomial-time solvable for every fixed number k of demands. There are special cases of the problem that can be solved much more efficiently: for example, the special case Directed Steiner Tree (when we ask for paths from a root r to terminals t_1, ..., t_k) is known to be fixed-parameter tractable parameterized by the number of terminals, that is, algorithms with running time of the form f(k)*n^{O(1)} exist for the problem. On the other hand, the special case Strongly Connected Steiner Subgraph (when we ask for a path from every t_i to every other t_j) is known to be W[1]-hard parameterized by the number of terminals, hence it is unlikely to be fixed-parameter tractable. In the talk, we survey results on parameterized algorithms for special cases of Directed Steiner Network, including a recent complete classification result (joint work with Andreas Feldmann) that systematically explores the complexity landscape of directed Steiner problems to fully understand which special cases are FPT or W[1]-hard.

Cite as

Dániel Marx. The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems (Invited Talk). In 15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 53, p. 32:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.SWAT.2016.32,
  author =	{Marx, D\'{a}niel},
  title =	{{The Complexity Landscape of Fixed-Parameter Directed Steiner Network Problems}},
  booktitle =	{15th Scandinavian Symposium and Workshops on Algorithm Theory (SWAT 2016)},
  pages =	{32:1--32:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-011-8},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{53},
  editor =	{Pagh, Rasmus},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SWAT.2016.32},
  URN =		{urn:nbn:de:0030-drops-60535},
  doi =		{10.4230/LIPIcs.SWAT.2016.32},
  annote =	{Keywords: Directed Steiner Tree, Directed Steiner Network, fixed-parameter tractability, dichotomy}
}
Document
Peeling and Nibbling the Cactus: Subexponential-Time Algorithms for Counting Triangulations and Related Problems

Authors: Dániel Marx and Tillmann Miltzow

Published in: LIPIcs, Volume 51, 32nd International Symposium on Computational Geometry (SoCG 2016)


Abstract
Given a set of n points S in the plane, a triangulation T of S is a maximal set of non-crossing segments with endpoints in S. We present an algorithm that computes the number of triangulations on a given set of n points in time n^{(11+o(1)) sqrt{n}}, significantly improving the previous best running time of O(2^n n^2) by Alvarez and Seidel [SoCG 2013]. Our main tool is identifying separators of size O(sqrt{n}) of a triangulation in a canonical way. The definition of the separators is based on the decomposition of the triangulation into nested layers ("cactus graphs"). Based on the above algorithm, we develop a simple and formal framework to count other non-crossing straight-line graphs in n^{O(sqrt{n})} time. We demonstrate the usefulness of the framework by applying it to counting non-crossing Hamilton cycles, spanning trees, perfect matchings, 3-colorable triangulations, connected graphs, cycle decompositions, quadrangulations, 3-regular graphs, and more.
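
For tiny point sets, the definition of a triangulation as a maximal non-crossing segment set can be checked directly; the brute-force counter below (assuming points in general position, and in no way resembling the n^{O(sqrt{n})} algorithm of the paper) simply enumerates segment subsets.

from itertools import combinations

def orient(p, q, r):
    """Twice the signed area of triangle pqr (positive for a left turn)."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def properly_cross(s, t):
    """Do the open segments s and t cross? Assumes no three input points are collinear."""
    a, b = s
    c, d = t
    if len({a, b, c, d}) < 4:                     # sharing an endpoint is not a crossing
        return False
    return orient(a, b, c) * orient(a, b, d) < 0 and orient(c, d, a) * orient(c, d, b) < 0

def count_triangulations(points):
    """Count maximal non-crossing segment sets over all 2^{C(n,2)} subsets (tiny inputs only)."""
    segments = list(combinations(points, 2))
    count = 0
    for r in range(len(segments) + 1):
        for subset in combinations(segments, r):
            if any(properly_cross(s, t) for s, t in combinations(subset, 2)):
                continue                          # crossing pair: not a plane graph
            if all(seg in subset or any(properly_cross(seg, s) for s in subset)
                   for seg in segments):
                count += 1                        # maximal: no further segment can be added
    return count

if __name__ == "__main__":
    square = [(0, 0), (1, 0), (1, 1), (0, 1)]
    print(count_triangulations(square))           # 2: one triangulation per diagonal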

Cite as

Dániel Marx and Tillmann Miltzow. Peeling and Nibbling the Cactus: Subexponential-Time Algorithms for Counting Triangulations and Related Problems. In 32nd International Symposium on Computational Geometry (SoCG 2016). Leibniz International Proceedings in Informatics (LIPIcs), Volume 51, pp. 52:1-52:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.SoCG.2016.52,
  author =	{Marx, D\'{a}niel and Miltzow, Tillmann},
  title =	{{Peeling and Nibbling the Cactus:  Subexponential-Time Algorithms for Counting Triangulations and Related Problems}},
  booktitle =	{32nd International Symposium on Computational Geometry (SoCG 2016)},
  pages =	{52:1--52:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-009-5},
  ISSN =	{1868-8969},
  year =	{2016},
  volume =	{51},
  editor =	{Fekete, S\'{a}ndor and Lubiw, Anna},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.SoCG.2016.52},
  URN =		{urn:nbn:de:0030-drops-59445},
  doi =		{10.4230/LIPIcs.SoCG.2016.52},
  annote =	{Keywords: computational geometry, triangulations, exponential-time algorithms}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301)

Authors: Andrei A. Bulatov, Venkatesan Guruswami, Andrei Krokhin, and Dániel Marx

Published in: Dagstuhl Reports, Volume 5, Issue 7 (2016)


Abstract
During the past two decades, an impressive array of diverse methods from several different mathematical fields, including algebra, logic, mathematical programming, probability theory, graph theory, and combinatorics, have been used to analyze both the computational complexity and approximability of algorithmic tasks related to the constraint satisfaction problem (CSP), as well as the applicability/limitations of algorithmic techniques. This research direction develops at an impressive speed, regularly producing very strong and general results. The Dagstuhl Seminar 15301 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP, so that they can share their insights obtained during the past three years. This report documents the material presented during the course of the seminar.

Cite as

Andrei A. Bulatov, Venkatesan Guruswami, Andrei Krokhin, and Dániel Marx. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301). In Dagstuhl Reports, Volume 5, Issue 7, pp. 22-41, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016)


Copy BibTex To Clipboard

@Article{bulatov_et_al:DagRep.5.7.22,
  author =	{Bulatov, Andrei A. and Guruswami, Venkatesan and Krokhin, Andrei and Marx, D\'{a}niel},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 15301)}},
  pages =	{22--41},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2016},
  volume =	{5},
  number =	{7},
  editor =	{Bulatov, Andrei A. and Guruswami, Venkatesan and Krokhin, Andrei and Marx, D\'{a}niel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.5.7.22},
  URN =		{urn:nbn:de:0030-drops-56714},
  doi =		{10.4230/DagRep.5.7.22},
  annote =	{Keywords: Constraint satisfaction problem (CSP), Computational complexity, CSP dichotomy conjecture, Hardness of approximation, Unique games conjecture, Fixed-parameter tractability, Descriptive complexity, Universal algebra, Logic, Decomposition methods}
}
Document
Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451)

Authors: Stefan Kratsch, Daniel Lokshtanov, Dániel Marx, and Peter Rossmanith

Published in: Dagstuhl Reports, Volume 4, Issue 11 (2015)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 14451 "Optimality and tight results in parameterized complexity". Over the last two decades parameterized complexity has become one of the main tools for handling intractable problems. Recently, tools have been developed not only to classify problems, but also to make statements about how close an algorithm is to being optimal with respect to running time. The focus of this seminar is to highlight and discuss recent, relevant results within this optimality framework and discover fruitful research directions. The report contains the abstracts of the results presented at the seminar, as well as a collection of open problems stated at the seminar.

Cite as

Stefan Kratsch, Daniel Lokshtanov, Dániel Marx, and Peter Rossmanith. Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451). In Dagstuhl Reports, Volume 4, Issue 11, pp. 1-21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2015)


Copy BibTex To Clipboard

@Article{kratsch_et_al:DagRep.4.11.1,
  author =	{Kratsch, Stefan and Lokshtanov, Daniel and Marx, D\'{a}niel and Rossmanith, Peter},
  title =	{{Optimality and tight results in parameterized complexity (Dagstuhl Seminar 14451)}},
  pages =	{1--21},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2015},
  volume =	{4},
  number =	{11},
  editor =	{Kratsch, Stefan and Lokshtanov, Daniel and Marx, D\'{a}niel and Rossmanith, Peter},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.4.11.1},
  URN =		{urn:nbn:de:0030-drops-49677},
  doi =		{10.4230/DagRep.4.11.1},
  annote =	{Keywords: Algorithms, parameterized complexity, kernels, width measures, exponential time hypothesis, lower bounds}
}
Document
Chordal Editing is Fixed-Parameter Tractable

Authors: Yixin Cao and Dániel Marx

Published in: LIPIcs, Volume 25, 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)


Abstract
Graph modification problems are typically asked as follows: is there a set of k operations that transforms a given graph to have a certain property? The most commonly considered operations include vertex deletion, edge deletion, and edge addition; for the same property, one can define significantly different versions by allowing different operations. We study a very general graph modification problem which allows all three types of operations: given a graph G and integers k_1, k_2, and k_3, the CHORDAL EDITING problem asks if G can be transformed into a chordal graph by at most k_1 vertex deletions, k_2 edge deletions, and k_3 edge additions. Clearly, this problem generalizes both CHORDAL VERTEX/EDGE DELETION and CHORDAL COMPLETION (also known as MINIMUM FILL-IN). Our main result is an algorithm for CHORDAL EDITING in time 2^{O(k log k)}·n^{O(1)}, where k := k_1+k_2+k_3; therefore, the problem is fixed-parameter tractable parameterized by the total number of allowed operations. Our algorithm is both more efficient and conceptually simpler than the previously known algorithm for the special case CHORDAL DELETION.
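
As a small illustration of the target property and of one of the three operations (not the 2^{O(k log k)}-time algorithm of the paper), the sketch below tests chordality by simplicial elimination and brute-forces the minimum number of vertex deletions; the toy graph and function names are invented.

from itertools import combinations

def is_chordal(adj):
    """Chordality test: repeatedly remove a simplicial vertex (one whose neighborhood is a clique)."""
    graph = {v: set(ns) for v, ns in adj.items()}
    while graph:
        simplicial = next((v for v, ns in graph.items()
                           if all(b in graph[a] for a, b in combinations(ns, 2))), None)
        if simplicial is None:
            return False                      # no simplicial vertex left: not chordal
        for u in graph[simplicial]:
            graph[u].discard(simplicial)
        del graph[simplicial]
    return True

def min_deletions_to_chordal(adj):
    """Smallest vertex set whose removal leaves a chordal graph, by brute force over subsets."""
    vertices = list(adj)
    for k in range(len(vertices) + 1):
        for removed in combinations(vertices, k):
            kept = {v: {u for u in adj[v] if u not in removed}
                    for v in vertices if v not in removed}
            if is_chordal(kept):
                return set(removed)
    return set(vertices)

if __name__ == "__main__":
    # The 4-cycle is the smallest non-chordal graph; deleting any one vertex makes it chordal.
    c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
    print(is_chordal(c4), min_deletions_to_chordal(c4))   # False {0}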

Cite as

Yixin Cao and Dániel Marx. Chordal Editing is Fixed-Parameter Tractable. In 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 25, pp. 214-225, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@InProceedings{cao_et_al:LIPIcs.STACS.2014.214,
  author =	{Cao, Yixin and Marx, D\'{a}niel},
  title =	{{Chordal Editing is Fixed-Parameter Tractable}},
  booktitle =	{31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)},
  pages =	{214--225},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-65-1},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{25},
  editor =	{Mayr, Ernst W. and Portier, Natacha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2014.214},
  URN =		{urn:nbn:de:0030-drops-44591},
  doi =		{10.4230/LIPIcs.STACS.2014.214},
  annote =	{Keywords: chordal graph, parameterized computation, graph modification problems, chordal deletion, chordal completion, clique tree decomposition, holes, simplic}
}
Document
Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask)

Authors: Dániel Marx and Michal Pilipczuk

Published in: LIPIcs, Volume 25, 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)


Abstract
Given two graphs H and G, the Subgraph Isomorphism problem asks if H is isomorphic to a subgraph of G. While NP-hard in general, algorithms exist for various parameterized versions of the problem. However, the literature contains very little guidance on which combinations of parameters can or cannot be exploited algorithmically. Our goal is to systematically investigate the possible parameterized algorithms that can exist for Subgraph Isomorphism. We develop a framework involving 10 relevant parameters for each of H and G (such as treewidth, pathwidth, genus, maximum degree, number of vertices, number of components, etc.), and ask if an algorithm with running time f_1(p_1,p_2,...,p_l)·n^{f_2(p_{l+1},...,p_k)} exists, where each of p_1,...,p_k is one of the 10 parameters depending only on H or G. We show that all the questions arising in this framework are answered by a set of 11 maximal positive results (algorithms) and a set of 17 maximal negative results (hardness proofs); some of these results already appear in the literature, while others are new in this paper. On the algorithmic side, our study reveals for example that an unexpected combination of bounded degree, genus, and feedback vertex set number of G gives rise to a highly nontrivial algorithm for Subgraph Isomorphism. On the hardness side, we present W[1]-hardness proofs under extremely restricted conditions, such as when H is a bounded-degree tree of constant pathwidth and G is a planar graph of bounded pathwidth.
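
For reference, the unparameterized problem is easy to state as a brute-force check over injective mappings, which is exponential and not one of the parameterized algorithms classified in the paper; the toy graphs below are invented.

from itertools import permutations

def subgraph_isomorphic(h_adj, g_adj):
    """Is H isomorphic to a (not necessarily induced) subgraph of G? Brute force over injections."""
    h_vertices, g_vertices = list(h_adj), list(g_adj)
    for image in permutations(g_vertices, len(h_vertices)):
        phi = dict(zip(h_vertices, image))
        if all(phi[v] in g_adj[phi[u]] for u in h_adj for v in h_adj[u]):
            return True                       # every edge of H is mapped onto an edge of G
    return False

if __name__ == "__main__":
    path3 = {0: [1], 1: [0, 2], 2: [1]}                                            # H: path on 3 vertices
    square = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}  # G: 4-cycle
    print(subgraph_isomorphic(path3, square))   # True: three consecutive cycle vertices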

Cite as

Dániel Marx and Michal Pilipczuk. Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask). In 31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014). Leibniz International Proceedings in Informatics (LIPIcs), Volume 25, pp. 542-553, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.STACS.2014.542,
  author =	{Marx, D\'{a}niel and Pilipczuk, Michal},
  title =	{{Everything you always wanted to know about the parameterized complexity of Subgraph Isomorphism (but were afraid to ask)}},
  booktitle =	{31st International Symposium on Theoretical Aspects of Computer Science (STACS 2014)},
  pages =	{542--553},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-65-1},
  ISSN =	{1868-8969},
  year =	{2014},
  volume =	{25},
  editor =	{Mayr, Ernst W. and Portier, Natacha},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2014.542},
  URN =		{urn:nbn:de:0030-drops-44863},
  doi =		{10.4230/LIPIcs.STACS.2014.542},
  annote =	{Keywords: parameterized complexity, subgraph isomorphism}
}
Document
Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421)

Authors: Glencora Borradaile, Philp Klein, Dániel Marx, and Claire Mathieu

Published in: Dagstuhl Reports, Volume 3, Issue 10 (2014)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 13421 "Algorithms for Optimization Problems in Planar Graphs". The seminar was held from October 13 to October 18, 2013. This report contains abstracts for the recent developments in planar graph algorithms discussed during the seminar as well as summaries of open problems in this area of research.

Cite as

Glencora Borradaile, Philp Klein, Dániel Marx, and Claire Mathieu. Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421). In Dagstuhl Reports, Volume 3, Issue 10, pp. 36-57, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)


Copy BibTex To Clipboard

@Article{borradaile_et_al:DagRep.3.10.36,
  author =	{Borradaile, Glencora and Klein, Philp and Marx, D\'{a}niel and Mathieu, Claire},
  title =	{{Algorithms for Optimization Problems in Planar Graphs (Dagstuhl Seminar 13421)}},
  pages =	{36--57},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2014},
  volume =	{3},
  number =	{10},
  editor =	{Borradaile, Glencora and Klein, Philp and Marx, D\'{a}niel and Mathieu, Claire},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.3.10.36},
  URN =		{urn:nbn:de:0030-drops-44274},
  doi =		{10.4230/DagRep.3.10.36},
  annote =	{Keywords: Algorithms, planar graphs, theory, approximation, fixed-parameter tractable, network flow, network design, kernelization}
}
Document
The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451)

Authors: Johan Hastad, Andrei Krokhin, and Dániel Marx

Published in: Dagstuhl Reports, Volume 2, Issue 11 (2013)


Abstract
During the past two decades, an impressive array of diverse methods from several different mathematical fields, including algebra, logic, analysis, probability theory, graph theory, and combinatorics, have been used to analyze both the computational complexity and approximability of algorithmic tasks related to the constraint satisfaction problem (CSP), as well as the applicability/limitations of algorithmic techniques. The Dagstuhl Seminar 12451 "The Constraint Satisfaction Problem: Complexity and Approximability" was aimed at bringing together researchers using all the different techniques in the study of the CSP, so that they can share their insights. This report documents the material presented during the course of the seminar.

Cite as

Johan Hastad, Andrei Krokhin, and Dániel Marx. The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451). In Dagstuhl Reports, Volume 2, Issue 11, pp. 1-19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


Copy BibTex To Clipboard

@Article{hastad_et_al:DagRep.2.11.1,
  author =	{Hastad, Johan and Krokhin, Andrei and Marx, D\'{a}niel},
  title =	{{The Constraint Satisfaction Problem: Complexity and Approximability (Dagstuhl Seminar 12451)}},
  pages =	{1--19},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2013},
  volume =	{2},
  number =	{11},
  editor =	{Hastad, Johan and Krokhin, Andrei and Marx, D\'{a}niel},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.2.11.1},
  URN =		{urn:nbn:de:0030-drops-39764},
  doi =		{10.4230/DagRep.2.11.1},
  annote =	{Keywords: Constraint satisfaction problem (CSP); Computational complexity; CSP dichotomy conjecture; Hardness of approximation; Unique games conjecture; Fixed-parameter tractability; Descriptive complexity; Universal algebra; Logic; Decomposition methods}
}
Document
Tutorial
Algorithmic Graph Structure Theory (Tutorial)

Authors: Dániel Marx

Published in: LIPIcs, Volume 20, 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)


Abstract
The Graph Minors project of Robertson and Seymour uncovered a very deep structural theory of graphs. This theory had several important consequences, among others, the proof of Wagner's Conjecture. While the whole theory, presented in a series of 23 very dense papers, is notoriously difficult to understand, it has to be emphasized that these papers introduced several elementary concepts and tools that had strong impact on algorithms, complexity, and combinatorics. Moreover, even some of the very deep results can be stated in a compact and useful way, and it is possible to build upon these results without a complete understanding of the underlying machinery. In the first part of the lecture, I will introduce the concept of treewidth, which can be thought of as an elementary entry point to graph minors theory. I will overview its graph-theoretic and algorithmic properties that make it especially important in the design of parameterized algorithms and approximation schemes on planar graphs. Furthermore, I will briefly explain some of the connections of treewidth to complexity and automata theory. In the next part of the lecture, we will turn our attention to the more advanced topic of graphs excluding a fixed minor: the structure of such graphs, finding minors, and the well-quasi-ordering of the minor relation. The primary goal here is to provide clear and useful statements of these results and to show how they generalize the concepts of treewidth and planar graphs. Finally, I will briefly overview some more recent results involving different kinds of excluded structures, such as graphs excluding odd minors and topological minors.

Cite as

Dániel Marx. Algorithmic Graph Structure Theory (Tutorial). In 30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013). Leibniz International Proceedings in Informatics (LIPIcs), Volume 20, p. 7, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2013)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.STACS.2013.7,
  author =	{Marx, D\'{a}niel},
  title =	{{Algorithmic Graph Structure Theory}},
  booktitle =	{30th International Symposium on Theoretical Aspects of Computer Science (STACS 2013)},
  pages =	{7--7},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-50-7},
  ISSN =	{1868-8969},
  year =	{2013},
  volume =	{20},
  editor =	{Portier, Natacha and Wilke, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2013.7},
  URN =		{urn:nbn:de:0030-drops-39175},
  doi =		{10.4230/LIPIcs.STACS.2013.7},
  annote =	{Keywords: Graph theory, graph minors, structure theorems}
}
Document
Data Reduction and Problem Kernels (Dagstuhl Seminar 12241)

Authors: Michael R. Fellows, Jiong Guo, Dániel Marx, and Saket Saurabh

Published in: Dagstuhl Reports, Volume 2, Issue 6 (2012)


Abstract
This report documents the program and the outcomes of Dagstuhl Seminar 12241 "Data Reduction and Problem Kernels". During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Michael R. Fellows, Jiong Guo, Dániel Marx, and Saket Saurabh. Data Reduction and Problem Kernels (Dagstuhl Seminar 12241). In Dagstuhl Reports, Volume 2, Issue 6, pp. 26-50, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2012)


Copy BibTex To Clipboard

@Article{fellows_et_al:DagRep.2.6.26,
  author =	{Fellows, Michael R. and Guo, Jiong and Marx, D\'{a}niel and Saurabh, Saket},
  title =	{{Data Reduction and Problem Kernels (Dagstuhl Seminar 12241)}},
  pages =	{26--50},
  journal =	{Dagstuhl Reports},
  ISSN =	{2192-5283},
  year =	{2012},
  volume =	{2},
  number =	{6},
  editor =	{Fellows, Michael R. and Guo, Jiong and Marx, D\'{a}niel and Saurabh, Saket},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagRep.2.6.26},
  URN =		{urn:nbn:de:0030-drops-37297},
  doi =		{10.4230/DagRep.2.6.26},
  annote =	{Keywords: Preprocessing, Fixed-parameter tractability, Parameterized algorithmics}
}
Document
Treewidth Reduction for Constrained Separation and Bipartization Problems

Authors: Dániel Marx, Barry O'Sullivan, and Igor Razgon

Published in: LIPIcs, Volume 5, 27th International Symposium on Theoretical Aspects of Computer Science (2010)


Abstract
We present a method for reducing the treewidth of a graph while preserving all the minimal s-t separators. This technique turns out to be very useful for establishing the fixed-parameter tractability of constrained separation and bipartization problems. To demonstrate the power of this technique, we prove the fixed-parameter tractability of a number of well-known separation and bipartization problems with various additional restrictions (e.g., the vertices being removed from the graph form an independent set). These results answer a number of open questions in the area of parameterized complexity.

Cite as

Dániel Marx, Barry O'Sullivan, and Igor Razgon. Treewidth Reduction for Constrained Separation and Bipartization Problems. In 27th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 5, pp. 561-572, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{marx_et_al:LIPIcs.STACS.2010.2485,
  author =	{Marx, D\'{a}niel and O'Sullivan, Barry and Razgon, Igor},
  title =	{{Treewidth Reduction for Constrained Separation and Bipartization Problems}},
  booktitle =	{27th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{561--572},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-16-3},
  ISSN =	{1868-8969},
  year =	{2010},
  volume =	{5},
  editor =	{Marion, Jean-Yves and Schwentick, Thomas},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2010.2485},
  URN =		{urn:nbn:de:0030-drops-24850},
  doi =		{10.4230/LIPIcs.STACS.2010.2485},
  annote =	{Keywords: Fixed-parameter algorithms, graph separation problems, treewidth}
}
Document
09511 Abstracts Collection – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
From December 14 to December 17, 2009, the Dagstuhl Seminar 09511 "Parameterized complexity and approximation algorithms" was held in Schloss Dagstuhl – Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Abstracts Collection – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.1,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Abstracts Collection – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.1},
  URN =		{urn:nbn:de:0030-drops-25025},
  doi =		{10.4230/DagSemProc.09511.1},
  annote =	{Keywords: Parameterized complexity, Approximation algorithms}
}
Document
09511 Executive Summary – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
Many of the computational problems that arise in practice are optimization problems: the task is to find a solution where the cost, quality, size, profit, or some other measure is as large or small as possible. The NP-hardness of an optimization problem implies that, unless P = NP, there is no polynomial-time algorithm that finds the exact value of the optimum. Various approaches have been proposed in the literature to cope with NP-hard problems. When designing approximation algorithms, we relax the requirement that the algorithm produces an optimum solution, and our aim is to devise a polynomial-time algorithm such that the solution it produces is not necessarily optimal, but there is some worst-case bound on the solution quality.

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Executive Summary – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.2,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Executive Summary – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.2},
  URN =		{urn:nbn:de:0030-drops-25011},
  doi =		{10.4230/DagSemProc.09511.2},
  annote =	{Keywords: Parameterized complexity, Approximation algorithms}
}
Document
09511 Open Problems – Parameterized complexity and approximation algorithms

Authors: Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx

Published in: Dagstuhl Seminar Proceedings, Volume 9511, Parameterized complexity and approximation algorithms (2010)


Abstract
The paper contains a list of the problems presented on Monday, December 14, 2009 at the open problem session of the Seminar on Parameterized Complexity and Approximation Algorithms, held at Schloss Dagstuhl in Wadern, Germany.

Cite as

Erik D. Demaine, MohammadTaghi Hajiaghayi, and Dániel Marx. 09511 Open Problems – Parameterized complexity and approximation algorithms. In Parameterized complexity and approximation algorithms. Dagstuhl Seminar Proceedings, Volume 9511, pp. 1-10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2010)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.09511.3,
  author =	{Demaine, Erik D. and Hajiaghayi, MohammadTaghi and Marx, D\'{a}niel},
  title =	{{09511 Open Problems – Parameterized complexity and approximation algorithms}},
  booktitle =	{Parameterized complexity and approximation algorithms},
  pages =	{1--10},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2010},
  volume =	{9511},
  editor =	{Erik D. Demaine and MohammadTaghi Hajiaghayi and D\'{a}niel Marx},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.09511.3},
  URN =		{urn:nbn:de:0030-drops-24992},
  doi =		{10.4230/DagSemProc.09511.3},
  annote =	{Keywords: Parameterized complexity, approximation algorithms, open problems}
}
Document
Enumerating Homomorphisms

Authors: Andrei A. Bulatov, Victor Dalmau, Martin Grohe, and Daniel Marx

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The homomorphism problem for relational structures is an abstract way of formulating constraint satisfaction problems (CSP) and various problems in database theory. The decision version of the homomorphism problem received a lot of attention in the literature; in particular, the way the graph-theoretical structure of the variables and constraints influences the complexity of the problem is intensively studied. Here we study the problem of enumerating all the solutions with polynomial delay from a similar point of view. It turns out that the enumeration problem behaves very differently from the decision version. We give evidence that it is unlikely that a characterization result similar to the decision version can be obtained. Nevertheless, we show nontrivial cases where enumeration can be done with polynomial delay.
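
To make the enumeration target concrete, here is a plain backtracking enumerator for the graph case of the homomorphism problem; it carries no polynomial-delay guarantee and is not the characterization studied in the paper, and the example graphs are invented.

def enumerate_homomorphisms(h_vertices, h_edges, g_adj):
    """Yield every homomorphism (as a dict) from the graph (h_vertices, h_edges) into the graph g_adj."""
    order = list(h_vertices)

    def consistent(partial, v, image):
        # Every already-mapped neighbor of v must be sent to a neighbor of the proposed image.
        return all(image in g_adj[partial[u]]
                   for (a, b) in h_edges
                   for (u, w) in ((a, b), (b, a))
                   if w == v and u in partial)

    def extend(i, partial):
        if i == len(order):
            yield dict(partial)
            return
        v = order[i]
        for image in g_adj:
            if consistent(partial, v, image):
                partial[v] = image
                yield from extend(i + 1, partial)
                del partial[v]

    yield from extend(0, {})

if __name__ == "__main__":
    # Homomorphisms from a single edge {0, 1} into a triangle: the 6 ordered pairs of distinct vertices.
    triangle = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
    for hom in enumerate_homomorphisms([0, 1], [(0, 1)], triangle):
        print(hom)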

Cite as

Andrei A. Bulatov, Victor Dalmau, Martin Grohe, and Daniel Marx. Enumerating Homomorphisms. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 231-242, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


Copy BibTex To Clipboard

@InProceedings{bulatov_et_al:LIPIcs.STACS.2009.1838,
  author =	{Bulatov, Andrei A. and Dalmau, Victor and Grohe, Martin and Marx, Daniel},
  title =	{{Enumerating Homomorphisms}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{231--242},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1838},
  URN =		{urn:nbn:de:0030-drops-18385},
  doi =		{10.4230/LIPIcs.STACS.2009.1838},
  annote =	{Keywords: }
}
Document
Tractable Structures for Constraint Satisfaction with Truth Tables

Authors: Daniel Marx

Published in: LIPIcs, Volume 3, 26th International Symposium on Theoretical Aspects of Computer Science (2009)


Abstract
The way the graph structure of the constraints influences the complexity of constraint satisfaction problems (CSP) is well understood for bounded-arity constraints. The situation is less clear if there is no bound on the arities. In this case the answer also depends on how the constraints are represented in the input. We study this question for the truth-table representation of constraints. We introduce a new hypergraph measure, adaptive width, and show that CSP with truth tables is polynomial-time solvable when restricted to a class of hypergraphs with bounded adaptive width. Conversely, assuming a conjecture on the complexity of binary CSP, there is no other polynomial-time solvable case.

Cite as

Daniel Marx. Tractable Structures for Constraint Satisfaction with Truth Tables. In 26th International Symposium on Theoretical Aspects of Computer Science. Leibniz International Proceedings in Informatics (LIPIcs), Volume 3, pp. 649-660, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2009)


Copy BibTex To Clipboard

@InProceedings{marx:LIPIcs.STACS.2009.1807,
  author =	{Marx, Daniel},
  title =	{{Tractable Structures for Constraint Satisfaction with Truth Tables}},
  booktitle =	{26th International Symposium on Theoretical Aspects of Computer Science},
  pages =	{649--660},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-939897-09-5},
  ISSN =	{1868-8969},
  year =	{2009},
  volume =	{3},
  editor =	{Albers, Susanne and Marion, Jean-Yves},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.STACS.2009.1807},
  URN =		{urn:nbn:de:0030-drops-18079},
  doi =		{10.4230/LIPIcs.STACS.2009.1807},
  annote =	{Keywords: Computational complexity, Constraint satisfaction, Treewidth, Adaptive width}
}
Document
07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs

Authors: Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege

Published in: Dagstuhl Seminar Proceedings, Volume 7281, Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs (2007)


Abstract
The following is a list of the problems presented on Monday, July 9, 2007 at the open-problem session of the Seminar on Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs, held at Schloss Dagstuhl in Wadern, Germany.

Cite as

Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege. 07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. In Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. Dagstuhl Seminar Proceedings, Volume 7281, pp. 1-6, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.07281.2,
  author =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  title =	{{07281 Open Problems – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs}},
  booktitle =	{Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs},
  pages =	{1--6},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7281},
  editor =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07281.2},
  URN =		{urn:nbn:de:0030-drops-12542},
  doi =		{10.4230/DagSemProc.07281.2},
  annote =	{Keywords: }
}
Document
07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs

Authors: Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege

Published in: Dagstuhl Seminar Proceedings, Volume 7281, Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs (2007)


Abstract
From 8th to 13th July 2007, the Dagstuhl Seminar "Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

Cite as

Erik Demaine, Gregory Z. Gutin, Daniel Marx, and Ulrike Stege. 07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. In Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs. Dagstuhl Seminar Proceedings, Volume 7281, pp. 1-14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:DagSemProc.07281.1,
  author =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  title =	{{07281 Abstracts Collection – Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs}},
  booktitle =	{Structure Theory and FPT Algorithmics for Graphs, Digraphs and Hypergraphs},
  pages =	{1--14},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7281},
  editor =	{Demaine, Erik and Gutin, Gregory Z. and Marx, Daniel and Stege, Ulrike},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07281.1},
  URN =		{urn:nbn:de:0030-drops-12450},
  doi =		{10.4230/DagSemProc.07281.1},
  annote =	{Keywords: Parameterized complexity, fixed-parameter tractability, graph structure theory}
}